[ 531.797528] env[60679]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 532.269961] env[60722]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 533.804520] env[60722]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=60722) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 533.804843] env[60722]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=60722) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 533.804958] env[60722]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=60722) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 533.805245] env[60722]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 533.806284] env[60722]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 533.923781] env[60722]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=60722) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 533.934632] env[60722]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s {{(pid=60722) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 534.031500] env[60722]: INFO nova.virt.driver [None req-64d3ac5e-1eea-4185-ad43-eb3b33f6e8f0 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 534.105301] env[60722]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 534.105469] env[60722]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 534.105537] env[60722]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=60722) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 537.306725] env[60722]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-31b0a525-b8c6-40a1-9dc3-853989b9fed5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 537.318779] env[60722]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=60722) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 537.318953] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-eda177cd-8004-464c-b2b7-7be011d53493 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 537.344304] env[60722]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 21d4f.
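The two leading entries are eventlet's standard import-order warning: the compute service monkey-patches the standard library at startup, and any module with known patching issues (here urllib3) that was already imported gets reported. A minimal sketch of the ordering that avoids the warning is below; it is illustrative only, not Nova's actual entry-point code, which performs the equivalent patching before importing anything else.

```python
# Minimal sketch of eventlet import ordering (illustrative, not Nova's code).
# monkey_patch() must run before modules such as urllib3 are imported, or
# eventlet logs the "imported prior to eventlet monkey patching" warning.
import eventlet

eventlet.monkey_patch()  # patch socket/threading/time first

import urllib3  # noqa: E402 - imported after patching, so it uses green sockets

http = urllib3.PoolManager()
resp = http.request("GET", "https://example.org")
print(resp.status)
```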
[ 537.344472] env[60722]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.239s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 537.345072] env[60722]: INFO nova.virt.vmwareapi.driver [None req-64d3ac5e-1eea-4185-ad43-eb3b33f6e8f0 None None] VMware vCenter version: 7.0.3
[ 537.348461] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8758b10-4415-4931-94e3-733d4503f678 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 537.370682] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a36f2aeb-ec73-4099-8251-665971b71913 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 537.376587] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d755669a-0640-48ad-b762-99421ca80575 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 537.383267] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-496b2e96-4046-4f7a-9dda-8842661cfd45 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 537.396254] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-838fa536-e183-4733-839a-f3fb19a4da17 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 537.402250] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23cb587e-6b80-42ac-ad04-d9bc8fc7d34a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 537.432312] env[60722]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-0b7397a7-a2df-4a9f-aea0-c9eaaea632ab {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 537.437546] env[60722]: DEBUG nova.virt.vmwareapi.driver [None req-64d3ac5e-1eea-4185-ad43-eb3b33f6e8f0 None None] Extension org.openstack.compute already exists. {{(pid=60722) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 537.440148] env[60722]: INFO nova.compute.provider_config [None req-64d3ac5e-1eea-4185-ad43-eb3b33f6e8f0 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
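Everything from the suds client creation through SessionManager.Login and the PropertyCollector.RetrievePropertiesEx calls above is the VMwareVCDriver talking to vCenter through oslo.vmware. A rough sketch of the same flow using the public oslo.vmware API is shown below; the hostname and credentials are placeholders, and this is generic library usage rather than Nova's driver code.

```python
# Rough sketch of the vCenter session setup the driver performs via oslo.vmware.
# Host name and credentials are placeholders, not values from this deployment.
from oslo_vmware import api
from oslo_vmware import vim_util

session = api.VMwareAPISession(
    "vc.example.test",              # vCenter endpoint (placeholder)
    "administrator@vsphere.local",  # placeholder credentials
    "secret",
    api_retry_count=10,
    task_poll_interval=0.5)

# The "VMware vCenter version: 7.0.3" line comes from the ServiceContent
# 'about' info retrieved when the session is established.
print(session.vim.service_content.about.version)

# RetrievePropertiesEx round-trips like those above are issued through
# invoke_api() lookups, e.g. listing the first page of compute clusters:
clusters = session.invoke_api(vim_util, "get_objects", session.vim,
                              "ClusterComputeResource", 100)
print(clusters)
```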
[ 537.456907] env[60722]: DEBUG nova.context [None req-64d3ac5e-1eea-4185-ad43-eb3b33f6e8f0 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),e91aa2a0-c000-44bb-af40-cd4144808ec8(cell1) {{(pid=60722) load_cells /opt/stack/nova/nova/context.py:464}}
[ 537.458760] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 537.458976] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 537.459707] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 537.460084] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Acquiring lock "e91aa2a0-c000-44bb-af40-cd4144808ec8" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 537.460409] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Lock "e91aa2a0-c000-44bb-af40-cd4144808ec8" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 537.461235] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Lock "e91aa2a0-c000-44bb-af40-cd4144808ec8" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 537.473162] env[60722]: DEBUG oslo_db.sqlalchemy.engines [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60722) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 537.478947] env[60722]: ERROR nova.db.main.api [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 537.478947] env[60722]: result = function(*args, **kwargs)
[ 537.478947] env[60722]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 537.478947] env[60722]: return func(*args, **kwargs)
[ 537.478947] env[60722]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 537.478947] env[60722]: result = fn(*args, **kwargs)
[ 537.478947] env[60722]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 537.478947] env[60722]: return f(*args, **kwargs)
[ 537.478947] env[60722]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 537.478947] env[60722]: return db.service_get_minimum_version(context, binaries)
[ 537.478947] env[60722]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 537.478947] env[60722]: _check_db_access()
[ 537.478947] env[60722]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 537.478947] env[60722]: stacktrace = ''.join(traceback.format_stack())
[ 537.478947] env[60722]:
[ 537.480016] env[60722]: DEBUG oslo_db.sqlalchemy.engines [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60722) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 537.482710] env[60722]: ERROR nova.db.main.api [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 537.482710] env[60722]: result = function(*args, **kwargs)
[ 537.482710] env[60722]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 537.482710] env[60722]: return func(*args, **kwargs)
[ 537.482710] env[60722]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 537.482710] env[60722]: result = fn(*args, **kwargs)
[ 537.482710] env[60722]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 537.482710] env[60722]: return f(*args, **kwargs)
[ 537.482710] env[60722]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 537.482710] env[60722]: return db.service_get_minimum_version(context, binaries)
[ 537.482710] env[60722]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 537.482710] env[60722]: _check_db_access()
[ 537.482710] env[60722]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 537.482710] env[60722]: stacktrace = ''.join(traceback.format_stack())
[ 537.482710] env[60722]:
[ 537.483311] env[60722]: WARNING nova.objects.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 537.483311] env[60722]: WARNING nova.objects.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Failed to get minimum service version for cell e91aa2a0-c000-44bb-af40-cd4144808ec8
[ 537.483605] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Acquiring lock "singleton_lock" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 537.483763] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Acquired lock "singleton_lock" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 537.483996] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Releasing lock "singleton_lock" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 537.484318] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Full set of CONF: {{(pid=60722) _wait_for_exit_or_signal
/usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} [ 537.484454] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ******************************************************************************** {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 537.484575] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] Configuration options gathered from: {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 537.484706] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 537.484892] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 537.485019] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ================================================================================ {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 537.485222] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] allow_resize_to_same_host = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.485385] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] arq_binding_timeout = 300 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.485511] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] backdoor_port = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.485633] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] backdoor_socket = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.485792] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] block_device_allocate_retries = 60 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.485947] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] block_device_allocate_retries_interval = 3 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.486121] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cert = self.pem {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.486281] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.486445] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] 
compute_monitors = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.486608] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] config_dir = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.486776] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] config_drive_format = iso9660 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.486919] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.487092] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] config_source = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.487259] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] console_host = devstack {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.487418] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] control_exchange = nova {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.487570] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cpu_allocation_ratio = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.487724] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] daemon = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.487886] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] debug = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.488044] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] default_access_ip_network_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.488205] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] default_availability_zone = nova {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.488355] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] default_ephemeral_format = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.488580] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 
'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.488737] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] default_schedule_zone = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.488891] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] disk_allocation_ratio = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.489056] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] enable_new_services = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.489226] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] enabled_apis = ['osapi_compute'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.489381] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] enabled_ssl_apis = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.489542] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] flat_injected = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.489693] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] force_config_drive = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.489846] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] force_raw_images = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.490039] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] graceful_shutdown_timeout = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.490181] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] heal_instance_info_cache_interval = 60 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.490387] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] host = cpu-1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.490551] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.490706] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] initial_disk_allocation_ratio = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.490862] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] initial_ram_allocation_ratio = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.491074] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.491236] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] instance_build_timeout = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.491393] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] instance_delete_interval = 300 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.491552] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] instance_format = [instance: %(uuid)s] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.491713] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] instance_name_template = instance-%08x {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.491869] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] instance_usage_audit = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.492039] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] instance_usage_audit_period = month {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.492200] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.492360] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] instances_path = /opt/stack/data/nova/instances {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.492521] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] internal_service_availability_zone = internal {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.492676] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] key = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.492831] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] live_migration_retry_count = 30 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.492985] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] log_config_append = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.493156] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.493307] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] log_dir = None {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.493456] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] log_file = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.493579] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] log_options = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.493737] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] log_rotate_interval = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.493898] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] log_rotate_interval_type = days {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.494071] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] log_rotation_type = none {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.494197] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.494316] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.494473] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.494627] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.494752] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.494908] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] long_rpc_timeout = 1800 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.495077] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] max_concurrent_builds = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.495233] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] max_concurrent_live_migrations = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.495385] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] max_concurrent_snapshots = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.495534] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] max_local_block_devices = 3 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.495688] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] max_logfile_count = 30 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.495843] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] max_logfile_size_mb = 200 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.495997] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] maximum_instance_delete_attempts = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.496170] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] metadata_listen = 0.0.0.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.496333] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] metadata_listen_port = 8775 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.496495] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] metadata_workers = 2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.496652] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] migrate_max_retries = -1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.496812] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] mkisofs_cmd = genisoimage {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.497017] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] my_block_storage_ip = 10.180.1.21 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.497153] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] my_ip = 10.180.1.21 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.497307] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] network_allocate_retries = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.497479] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.497638] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] osapi_compute_listen = 0.0.0.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.497796] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] osapi_compute_listen_port = 8774 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.497961] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] osapi_compute_unique_server_name_scope = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.498132] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] osapi_compute_workers = 2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.498288] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] password_length = 12 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.498440] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] periodic_enable = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.498592] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] periodic_fuzzy_delay = 60 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.498752] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] pointer_model = usbtablet {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.498925] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] preallocate_images = none {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.499116] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] publish_errors = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.499216] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] pybasedir = /opt/stack/nova {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.499365] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ram_allocation_ratio = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.499517] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] rate_limit_burst = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.499674] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] rate_limit_except_level = CRITICAL {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.499828] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] rate_limit_interval = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.499978] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] reboot_timeout = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.500140] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] 
reclaim_instance_interval = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.500287] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] record = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.500445] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] reimage_timeout_per_gb = 60 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.500601] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] report_interval = 120 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.500778] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] rescue_timeout = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.500935] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] reserved_host_cpus = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.501097] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] reserved_host_disk_mb = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.501251] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] reserved_host_memory_mb = 512 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.501404] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] reserved_huge_pages = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.501556] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] resize_confirm_window = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.501709] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] resize_fs_using_block_device = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.501857] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] resume_guests_state_on_host_boot = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.502024] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.502177] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] rpc_response_timeout = 60 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.502331] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] run_external_periodic_tasks = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.502492] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] running_deleted_instance_action = reap 
{{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.502658] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] running_deleted_instance_poll_interval = 1800 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.502809] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] running_deleted_instance_timeout = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.502962] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] scheduler_instance_sync_interval = 120 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.503099] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] service_down_time = 300 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.503261] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] servicegroup_driver = db {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.503414] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] shelved_offload_time = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.503565] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] shelved_poll_interval = 3600 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.503724] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] shutdown_timeout = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.503877] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] source_is_ipv6 = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.504036] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ssl_only = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.504271] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.504432] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] sync_power_state_interval = 600 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.504588] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] sync_power_state_pool_size = 1000 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.504747] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] syslog_log_facility = LOG_USER {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.504895] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] tempdir = None {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.505055] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] timeout_nbd = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.505217] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] transport_url = **** {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.505370] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] update_resources_interval = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.505522] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] use_cow_images = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.505674] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] use_eventlog = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.505826] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] use_journal = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.505975] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] use_json = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.506136] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] use_rootwrap_daemon = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.506287] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] use_stderr = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.506437] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] use_syslog = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.506584] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vcpu_pin_set = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.506743] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plugging_is_fatal = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.506902] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plugging_timeout = 300 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.507072] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] virt_mkfs = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.507230] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] volume_usage_poll_interval = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.507387] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] watch_log_file = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.507550] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] web = /usr/share/spice-html5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 537.507733] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_concurrency.disable_process_locking = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.508029] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.508210] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.508374] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.508538] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.508704] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.508863] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.509046] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.auth_strategy = keystone {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.509227] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.compute_link_prefix = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.509375] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.509543] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.dhcp_domain = novalocal {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.509880] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.enable_instance_password = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.509880] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.glance_link_prefix = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.510020] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.510184] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.510340] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.instance_list_per_project_cells = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.510495] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.list_records_by_skipping_down_cells = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.510651] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.local_metadata_per_cell = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.510816] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.max_limit = 1000 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.510990] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.metadata_cache_expiration = 15 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.511173] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.neutron_default_tenant_id = default {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.511334] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.use_forwarded_for = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.511490] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.use_neutron_default_nets = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.511651] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.511811] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.511969] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.512147] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.512311] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.vendordata_dynamic_targets = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.512478] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.vendordata_jsonfile_path = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.512659] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.512845] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.backend = dogpile.cache.memcached {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.513012] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.backend_argument = **** {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.513180] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.config_prefix = cache.oslo {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.513340] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.dead_timeout = 60.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.513496] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.debug_cache_backend = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.513651] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.enable_retry_client = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.513804] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.enable_socket_keepalive = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.513966] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.enabled = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.514136] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.expiration_time = 600 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.514290] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.hashclient_retry_attempts = 2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.514446] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.hashclient_retry_delay = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.514599] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] 
cache.memcache_dead_retry = 300 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.514758] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.memcache_password = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.514914] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.515083] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.515235] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.memcache_pool_maxsize = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.515390] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.515543] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.memcache_sasl_enabled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.515712] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.515870] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.memcache_socket_timeout = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.516038] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.memcache_username = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.516199] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.proxies = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.516650] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.retry_attempts = 2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.516650] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.retry_delay = 0.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.516650] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.socket_keepalive_count = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.516797] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.socket_keepalive_idle = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.516946] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.socket_keepalive_interval = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.517108] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.tls_allowed_ciphers = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.517259] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.tls_cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.517404] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.tls_certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.517554] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.tls_enabled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.517701] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cache.tls_keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.517862] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.auth_section = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.518046] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.auth_type = password {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.518210] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.518378] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.catalog_info = volumev3::publicURL {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.518528] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.518683] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.518836] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.cross_az_attach = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.518987] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.debug = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.519149] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.endpoint_template = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.519323] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 
None None] cinder.http_retries = 3 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.519455] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.insecure = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.519605] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.519767] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.os_region_name = RegionOne {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.519953] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.520089] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cinder.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.520257] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.520411] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute.cpu_dedicated_set = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.520562] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute.cpu_shared_set = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.520718] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute.image_type_exclude_list = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.520875] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.521045] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute.max_concurrent_disk_ops = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.521205] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute.max_disk_devices_to_attach = -1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.521361] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.521524] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
537.521681] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute.resource_provider_association_refresh = 300 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.521835] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute.shutdown_retry_interval = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.522022] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.522194] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] conductor.workers = 2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.522362] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] console.allowed_origins = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.522515] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] console.ssl_ciphers = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.522687] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] console.ssl_minimum_version = default {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.522854] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] consoleauth.token_ttl = 600 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.523035] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.523192] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.523350] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.523503] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.connect_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.523667] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.connect_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.523811] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.endpoint_override = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.523967] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.insecure = False {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.524129] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.524282] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.max_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.524435] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.min_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.524584] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.region_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.524736] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.service_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.524899] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.service_type = accelerator {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.525066] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.525221] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.status_code_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.525373] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.status_code_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.525528] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.525705] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.525861] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] cyborg.version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.526047] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.backend = sqlalchemy {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.526225] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.connection = **** {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.526391] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.connection_debug = 0 {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.526554] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.connection_parameters = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.526713] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.connection_recycle_time = 3600 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.526876] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.connection_trace = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.527042] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.db_inc_retry_interval = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.527204] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.db_max_retries = 20 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.527361] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.db_max_retry_interval = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.527518] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.db_retry_interval = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.527679] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.max_overflow = 50 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.527836] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.max_pool_size = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.527996] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.max_retries = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.528166] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.mysql_enable_ndb = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.528329] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.528483] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.mysql_wsrep_sync_wait = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.528641] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.pool_timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.528807] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.retry_interval = 10 
{{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.528962] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.slave_connection = **** {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.529133] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.sqlite_synchronous = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.529289] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] database.use_db_reconnect = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.529462] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.backend = sqlalchemy {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.529635] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.connection = **** {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.529801] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.connection_debug = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.529970] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.connection_parameters = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.530142] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.connection_recycle_time = 3600 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.530302] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.connection_trace = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.530460] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.db_inc_retry_interval = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.530618] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.db_max_retries = 20 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.530775] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.db_max_retry_interval = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.530930] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.db_retry_interval = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.531104] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.max_overflow = 50 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.531265] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.max_pool_size = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.531429] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.max_retries = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.531585] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.mysql_enable_ndb = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.531751] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.531907] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.532076] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.pool_timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.532522] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.retry_interval = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.532702] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.slave_connection = **** {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.532875] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] api_database.sqlite_synchronous = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.533060] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] devices.enabled_mdev_types = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.533239] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.533403] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ephemeral_storage_encryption.enabled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.533564] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.533735] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.api_servers = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.533894] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.cafile = None {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.534062] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.534255] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.534374] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.connect_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.534531] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.connect_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.534691] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.debug = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.534851] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.default_trusted_certificate_ids = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.535032] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.enable_certificate_validation = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.535198] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.enable_rbd_download = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.535351] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.endpoint_override = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.535510] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.insecure = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.535667] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.535821] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.max_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.535972] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.min_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.536143] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.num_retries = 3 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.536305] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.rbd_ceph_conf = {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.536464] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.rbd_connect_timeout = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.536627] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.rbd_pool = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.536791] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.rbd_user = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.536947] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.region_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.537109] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.service_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.537274] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.service_type = image {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.537431] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.537582] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.status_code_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.537738] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.status_code_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.537890] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.538083] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.538248] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.verify_glance_signatures = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.538401] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] glance.version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.538565] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] guestfs.debug = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.538731] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.config_drive_cdrom = False {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.538892] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.config_drive_inject_password = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.539067] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.539230] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.enable_instance_metrics_collection = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.539395] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.enable_remotefx = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.539561] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.instances_path_share = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.539722] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.iscsi_initiator_list = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.539880] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.limit_cpu_features = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.540049] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.540210] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.540373] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.power_state_check_timeframe = 60 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.540531] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.540695] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.540862] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.use_multipath_io = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.541026] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.volume_attach_retry_count = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.541185] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.541343] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.vswitch_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.541501] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.541666] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] mks.enabled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.542028] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.542213] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] image_cache.manager_interval = 2400 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.542380] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] image_cache.precache_concurrency = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.542545] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] image_cache.remove_unused_base_images = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.542713] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.542876] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.543053] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] image_cache.subdirectory_name = _base {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.543229] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.api_max_retries = 60 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.543390] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.api_retry_interval = 2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.543545] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.auth_section = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.543703] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.auth_type = None {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.543856] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.544015] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.544176] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.544330] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.connect_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.544489] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.connect_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.544642] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.endpoint_override = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.544800] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.insecure = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.544951] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546278] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.max_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546278] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.min_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546278] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.partition_key = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546278] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.peer_list = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546278] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.region_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546278] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.serial_console_state_timeout = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546278] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.service_name = None {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546515] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.service_type = baremetal {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546515] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546515] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.status_code_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546590] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.status_code_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546773] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.546909] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.547080] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ironic.version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.547254] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.547420] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] key_manager.fixed_key = **** {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.547591] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.547754] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.barbican_api_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.547903] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.barbican_endpoint = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.548206] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.barbican_endpoint_type = public {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.548263] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.barbican_region_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.548371] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.548516] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.549456] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.549456] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.insecure = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.549456] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.549456] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.number_of_retries = 60 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.549456] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.retry_delay = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.549456] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.send_service_user_token = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.549668] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.549709] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.550017] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.verify_ssl = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.550017] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican.verify_ssl_path = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.550563] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican_service_user.auth_section = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.550563] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican_service_user.auth_type = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.550563] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican_service_user.cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.550703] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican_service_user.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.550748] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican_service_user.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.550887] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican_service_user.insecure = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.551043] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican_service_user.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.551204] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican_service_user.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.551356] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] barbican_service_user.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.551516] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.approle_role_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.551668] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.approle_secret_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.551821] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.551972] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.552141] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.552297] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.insecure = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.552448] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.552640] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.kv_mountpoint = secret {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.552784] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.kv_version = 2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.552938] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.namespace = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.553099] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.root_token_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.553256] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.553401] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.ssl_ca_crt_file = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.553675] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.553761] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.use_ssl = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.553858] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.554027] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.554715] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.554715] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.554715] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.connect_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.554715] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.connect_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558225] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.endpoint_override = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558225] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.insecure = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558225] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558225] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 
None None] keystone.max_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558225] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.min_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558225] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.region_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558225] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.service_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558413] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.service_type = identity {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558413] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558413] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.status_code_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558413] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.status_code_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558413] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558413] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558413] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] keystone.version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558584] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.connection_uri = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558584] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.cpu_mode = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558584] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.cpu_model_extra_flags = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558584] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.cpu_models = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558584] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None 
None] libvirt.cpu_power_governor_high = performance {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558584] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.cpu_power_governor_low = powersave {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558584] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.cpu_power_management = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558758] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558758] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.device_detach_attempts = 8 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558758] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.device_detach_timeout = 20 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558758] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.disk_cachemodes = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558758] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.disk_prefix = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558758] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.enabled_perf_events = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.558919] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.file_backed_memory = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.559227] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.gid_maps = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.559227] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.hw_disk_discard = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.559352] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.hw_machine_type = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.559448] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.images_rbd_ceph_conf = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.559611] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.559774] env[60722]: DEBUG 
oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.560040] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.images_rbd_glance_store_name = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.560110] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.images_rbd_pool = rbd {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.560266] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.images_type = default {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.560423] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.images_volume_group = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.560575] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.inject_key = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.560739] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.inject_partition = -2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.560893] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.inject_password = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.561057] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.iscsi_iface = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.561214] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.iser_use_multipath = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.561371] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_bandwidth = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.561528] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.561684] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_downtime = 500 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.561841] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.561995] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.562158] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_inbound_addr = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.562314] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.562467] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_permit_post_copy = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.562645] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_scheme = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.562792] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_timeout_action = abort {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.562954] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_tunnelled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.563121] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_uri = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.563280] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.live_migration_with_native_tls = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.563435] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.max_queues = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.563620] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.563751] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.nfs_mount_options = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.564068] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.564243] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.564406] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.num_iser_scan_tries = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.564564] env[60722]: DEBUG 
oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.num_memory_encrypted_guests = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.564724] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.564885] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.num_pcie_ports = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.565056] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.num_volume_scan_tries = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.565221] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.pmem_namespaces = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.565375] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.quobyte_client_cfg = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.565654] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.565824] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.rbd_connect_timeout = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.565985] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.566157] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.566313] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.rbd_secret_uuid = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.566466] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.rbd_user = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.566625] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.566795] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.remote_filesystem_transport = ssh {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.566949] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.rescue_image_id = None {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.567113] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.rescue_kernel_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.567267] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.rescue_ramdisk_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.567430] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.567583] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.rx_queue_size = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.567745] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.smbfs_mount_options = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.568017] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.568187] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.snapshot_compression = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.568345] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.snapshot_image_format = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.568556] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.568719] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.sparse_logical_volumes = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.568882] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.swtpm_enabled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.569055] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.swtpm_group = tss {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.569221] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.swtpm_user = tss {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.569386] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.sysinfo_serial = unique {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.569542] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.tx_queue_size = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.569700] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.uid_maps = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.569860] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.use_virtio_for_bridges = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.570033] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.virt_type = kvm {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.570203] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.volume_clear = zero {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.570366] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.volume_clear_size = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.570527] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.volume_use_multipath = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.570682] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.vzstorage_cache_path = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.570847] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.571028] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.vzstorage_mount_group = qemu {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.571196] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.vzstorage_mount_opts = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.571363] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.571634] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.571807] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.vzstorage_mount_user = stack {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.571964] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.572141] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.auth_section = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.572308] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.auth_type = password {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.572464] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.572628] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.572768] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.572920] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.connect_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.573083] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.connect_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.573248] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.default_floating_pool = public {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.573402] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.endpoint_override = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.573560] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.extension_sync_interval = 600 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.573716] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.http_retries = 3 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.573874] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.insecure = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.574037] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.574194] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.max_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.574360] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.574801] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.min_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.574801] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.ovs_bridge = br-int {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.574884] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.physnets = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.574986] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.region_name = RegionOne {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.575163] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.service_metadata_proxy = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.575320] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.service_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.575484] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.service_type = network {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.575640] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.575795] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.status_code_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.575947] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.status_code_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.576112] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.576287] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.576443] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] neutron.version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.576608] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] notifications.bdms_in_notifications = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.576780] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] notifications.default_level = INFO 
{{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.576946] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] notifications.notification_format = unversioned {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.577124] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] notifications.notify_on_state_change = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.577298] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.577471] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] pci.alias = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.577646] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] pci.device_spec = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.577812] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] pci.report_in_placement = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.577984] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.auth_section = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.578167] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.auth_type = password {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.578332] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.578488] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.578643] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.578803] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.578958] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.connect_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.579126] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.connect_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.579281] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.default_domain_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.579434] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.default_domain_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.579586] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.domain_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.579737] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.domain_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.579890] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.endpoint_override = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.580054] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.insecure = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.580210] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.580389] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.max_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.580507] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.min_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.580667] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.password = **** {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.580821] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.project_domain_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.580981] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.project_domain_name = Default {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.581153] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.project_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.581322] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.project_name = service {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.581485] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.region_name = RegionOne {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.581640] env[60722]: DEBUG oslo_service.service 
[None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.service_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.581805] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.service_type = placement {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.581963] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.582131] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.status_code_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.582286] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.status_code_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.582439] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.system_scope = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.582593] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.582746] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.trust_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.582897] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.user_domain_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.583083] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.user_domain_name = Default {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.583245] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.user_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.583413] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.username = placement {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.583593] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.583747] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] placement.version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.583920] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.cores = 20 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.584090] env[60722]: DEBUG 
oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.count_usage_from_placement = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.584258] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.584428] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.injected_file_content_bytes = 10240 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.584590] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.injected_file_path_length = 255 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.584751] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.injected_files = 5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.584912] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.instances = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.585086] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.key_pairs = 100 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.585250] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.metadata_items = 128 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.585410] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.ram = 51200 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.585569] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.recheck_quota = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.585831] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.server_group_members = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.585894] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] quota.server_groups = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.586055] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] rdp.enabled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.586368] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.586548] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.586712] 
env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.586872] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] scheduler.image_metadata_prefilter = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.587042] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.587219] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] scheduler.max_attempts = 3 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.587383] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] scheduler.max_placement_results = 1000 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.587545] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.587703] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] scheduler.query_placement_for_availability_zone = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.587861] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] scheduler.query_placement_for_image_type_support = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.588025] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.588199] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] scheduler.workers = 2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.588371] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.588539] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.588716] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.588883] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.589059] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.589222] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.589382] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.589567] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.589732] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.host_subset_size = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.589890] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.590060] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.590224] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.isolated_hosts = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.590384] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.isolated_images = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.590543] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.590700] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.590859] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.pci_in_placement = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.591024] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.591186] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.591344] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.591498] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.591656] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.591818] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.591977] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.track_instance_changes = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.592159] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.592324] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] metrics.required = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.592482] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] metrics.weight_multiplier = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.592654] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.592806] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] metrics.weight_setting = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.593117] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.593292] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] serial_console.enabled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.593616] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] serial_console.port_range = 10000:20000 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.593616] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.593785] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.593949] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] serial_console.serialproxy_port = 6083 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.594117] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] service_user.auth_section = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.594285] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] service_user.auth_type = password {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.594439] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] service_user.cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.594591] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] service_user.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.594751] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] service_user.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.594907] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] service_user.insecure = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.595083] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] service_user.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.595255] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] service_user.send_service_user_token = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.595415] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] service_user.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.595570] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None 
None] service_user.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.595740] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] spice.agent_enabled = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.595911] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] spice.enabled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.596314] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.596403] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.596573] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] spice.html5proxy_port = 6082 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.596731] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] spice.image_compression = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.596897] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] spice.jpeg_compression = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.597062] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] spice.playback_compression = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.597232] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] spice.server_listen = 127.0.0.1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.597397] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.597553] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] spice.streaming_mode = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.597705] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] spice.zlib_compression = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.597867] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] upgrade_levels.baseapi = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.598025] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] upgrade_levels.cert = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.598191] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] upgrade_levels.compute = auto {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.598345] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] upgrade_levels.conductor = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.598505] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] upgrade_levels.scheduler = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.598668] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vendordata_dynamic_auth.auth_section = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.599734] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vendordata_dynamic_auth.auth_type = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.599734] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vendordata_dynamic_auth.cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.599734] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vendordata_dynamic_auth.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.599734] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.599734] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vendordata_dynamic_auth.insecure = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.599734] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vendordata_dynamic_auth.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.599734] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.599932] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vendordata_dynamic_auth.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.600097] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.api_retry_count = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.600185] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.ca_file = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.600348] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.cache_prefix = devstack-image-cache {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.600505] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.cluster_name = testcl1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.600662] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.connection_pool_size = 10 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.600813] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.console_delay_seconds = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.600972] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.datastore_regex = ^datastore.* {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.602917] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.602917] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.host_password = **** {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.602917] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.host_port = 443 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.602917] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.host_username = administrator@vsphere.local {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.602917] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.insecure = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.602917] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.integration_bridge = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.603135] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.maximum_objects = 100 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.603135] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.pbm_default_policy = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.603135] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.pbm_enabled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.603135] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.pbm_wsdl_location = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.603135] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.603135] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.serial_port_proxy_uri = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.603135] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.serial_port_service_uri = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.603311] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.task_poll_interval = 0.5 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.603438] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.use_linked_clone = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.603512] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.vnc_keymap = en-us {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.603668] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.vnc_port = 5900 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.603830] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vmware.vnc_port_total = 10000 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.604025] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vnc.auth_schemes = ['none'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.604198] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vnc.enabled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.604490] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.604675] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.604843] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vnc.novncproxy_port = 6080 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.605027] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vnc.server_listen = 127.0.0.1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.605204] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.605363] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 
None None] vnc.vencrypt_ca_certs = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.605517] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vnc.vencrypt_client_cert = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.605672] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vnc.vencrypt_client_key = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.605848] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.606014] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.606173] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.606329] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.606485] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.disable_rootwrap = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.606639] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.enable_numa_live_migration = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.606798] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.606961] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.607138] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.607296] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.libvirt_disable_apic = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.607451] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.607606] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] 
workarounds.qemu_monitor_announce_self_count = 3 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.607767] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.607921] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.608085] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.608245] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.608401] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.608555] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.608711] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.608871] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.609063] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.609235] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] wsgi.client_socket_timeout = 900 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.609399] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] wsgi.default_pool_size = 1000 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.609564] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] wsgi.keep_alive = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.609729] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] wsgi.max_header_line = 16384 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.609889] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] wsgi.secure_proxy_ssl_header 
= None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.610054] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] wsgi.ssl_ca_file = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.610211] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] wsgi.ssl_cert_file = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.610366] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] wsgi.ssl_key_file = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.610528] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] wsgi.tcp_keepidle = 600 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.610718] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.610854] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] zvm.ca_file = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.611023] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] zvm.cloud_connector_url = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.611299] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.611529] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] zvm.reachable_timeout = 300 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.611641] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_policy.enforce_new_defaults = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.611811] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_policy.enforce_scope = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.611984] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_policy.policy_default_rule = default {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.612175] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.612344] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_policy.policy_file = policy.yaml {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.612512] 
env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.612669] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.612821] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.612976] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.613143] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.613314] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.613487] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.613659] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] profiler.connection_string = messaging:// {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.613822] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] profiler.enabled = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.613985] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] profiler.es_doc_type = notification {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.614167] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] profiler.es_scroll_size = 10000 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.614317] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] profiler.es_scroll_time = 2m {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.614473] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] profiler.filter_error_trace = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.614637] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] profiler.hmac_keys = SECRET_KEY {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.614794] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] profiler.sentinel_service_name = mymaster {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.614963] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] profiler.socket_timeout = 0.1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.615132] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] profiler.trace_sqlalchemy = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.615293] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] remote_debug.host = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.615449] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] remote_debug.port = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.615623] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.615785] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.615945] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.616116] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.616274] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.616436] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.616592] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.616751] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.616911] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.617076] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] 
oslo_messaging_rabbit.kombu_compression = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.617243] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.617403] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.617566] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.617727] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.617887] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.618062] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.618223] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.618380] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.618539] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.618700] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.618845] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.619020] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.619190] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.619349] env[60722]: DEBUG 
oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.619510] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.619671] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.ssl = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.619838] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.620012] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.620171] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.620336] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.620504] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_rabbit.ssl_version = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.620685] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.620854] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_notifications.retry = -1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.621041] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.621217] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_messaging_notifications.transport_url = **** {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.621385] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.auth_section = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.621546] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.auth_type = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.621700] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None 
None] oslo_limit.cafile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.621853] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.certfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.622016] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.collect_timing = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.622173] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.connect_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.622328] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.connect_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.622480] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.endpoint_id = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.622639] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.endpoint_override = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.622797] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.insecure = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.622948] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.keyfile = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.623110] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.max_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.623263] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.min_version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.623416] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.region_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.623567] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.service_name = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.623730] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.service_type = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.623899] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.split_loggers = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.624018] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None 
None] oslo_limit.status_code_retries = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.624172] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.status_code_retry_delay = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.624320] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.timeout = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.624468] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.valid_interfaces = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.624615] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_limit.version = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.624770] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_reports.file_event_handler = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.624923] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.625602] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] oslo_reports.log_dir = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.625602] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.625602] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.625602] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.625841] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.625891] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629019] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629019] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=60722) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629019] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plug_ovs_privileged.group = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629019] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629019] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629019] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629409] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] vif_plug_ovs_privileged.user = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629409] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_linux_bridge.flat_interface = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629409] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629409] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629409] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629409] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629558] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629558] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629558] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629558] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_ovs.isolate_vif = False {{(pid=60722) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629558] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629558] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629558] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629730] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_ovs.ovsdb_interface = native {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629730] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_vif_ovs.per_port_bridge = False {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629730] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] os_brick.lock_path = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629730] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] privsep_osbrick.capabilities = [21] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629730] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] privsep_osbrick.group = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629730] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] privsep_osbrick.helper_command = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.629891] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.630150] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.630226] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] privsep_osbrick.user = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.630358] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.630509] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] nova_sys_admin.group = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.630658] env[60722]: DEBUG oslo_service.service [None 
req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] nova_sys_admin.helper_command = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.630813] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.630970] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.631771] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] nova_sys_admin.user = None {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 537.631771] env[60722]: DEBUG oslo_service.service [None req-c1216a20-12fc-4187-ad27-c215b51177c2 None None] ******************************************************************************** {{(pid=60722) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 537.631771] env[60722]: INFO nova.service [-] Starting compute node (version 0.1.0) [ 537.642039] env[60722]: INFO nova.virt.node [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Generated node identity 6d7f336b-9351-4171-8197-866cdafbab42 [ 537.642039] env[60722]: INFO nova.virt.node [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Wrote node identity 6d7f336b-9351-4171-8197-866cdafbab42 to /opt/stack/data/n-cpu-1/compute_id [ 537.653311] env[60722]: WARNING nova.compute.manager [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Compute nodes ['6d7f336b-9351-4171-8197-866cdafbab42'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 537.684734] env[60722]: INFO nova.compute.manager [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 537.706916] env[60722]: WARNING nova.compute.manager [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
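For reference, the [vmware] option values dumped above can be collected back into a nova.conf-style section. The following is only an illustrative sketch using the standard-library configparser: the option names and values are copied from the log lines above (the password stays masked exactly as logged), and the output file name is an arbitrary choice, not anything the service itself writes.

```python
import configparser

# Hypothetical reconstruction of the [vmware] section from the oslo.config
# values logged by log_opt_values above; values are taken verbatim from the log.
conf = configparser.ConfigParser()
conf["vmware"] = {
    "host_ip": "vc1.osci.c.eu-de-1.cloud.sap",
    "host_port": "443",
    "host_username": "administrator@vsphere.local",
    "host_password": "****",          # masked in the log, left masked here
    "cluster_name": "testcl1",
    "datastore_regex": "^datastore.*",
    "insecure": "True",
    "api_retry_count": "10",
    "connection_pool_size": "10",
    "use_linked_clone": "False",
    "cache_prefix": "devstack-image-cache",
    "serial_log_dir": "/opt/vmware/vspc",
    "task_poll_interval": "0.5",
    "maximum_objects": "100",
}

# Write the reconstructed section to a sample file (file name chosen here for
# illustration only).
with open("nova-vmware-sample.conf", "w") as fh:
    conf.write(fh)
```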
[ 537.707157] env[60722]: DEBUG oslo_concurrency.lockutils [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 537.707358] env[60722]: DEBUG oslo_concurrency.lockutils [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 537.707492] env[60722]: DEBUG oslo_concurrency.lockutils [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 537.707638] env[60722]: DEBUG nova.compute.resource_tracker [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 537.708784] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0883bcb8-69cf-42c3-9a69-1773543bcf2a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.717542] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ad4ace3-5596-42e6-bc1c-fcbd488d1f58 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.731497] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a943d3f9-17cc-4a52-88fc-ec3751459083 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.737761] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb215bfc-0854-4f78-9385-7471fd28ceb2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.766345] env[60722]: DEBUG nova.compute.resource_tracker [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181716MB free_disk=100GB free_vcpus=48 pci_devices=None {{(pid=60722) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 537.766486] env[60722]: DEBUG oslo_concurrency.lockutils [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 537.766652] env[60722]: DEBUG oslo_concurrency.lockutils [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 537.778713] env[60722]: WARNING nova.compute.resource_tracker [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] No compute node 
record for cpu-1:6d7f336b-9351-4171-8197-866cdafbab42: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 6d7f336b-9351-4171-8197-866cdafbab42 could not be found. [ 537.790828] env[60722]: INFO nova.compute.resource_tracker [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 6d7f336b-9351-4171-8197-866cdafbab42 [ 537.837100] env[60722]: DEBUG nova.compute.resource_tracker [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 537.837219] env[60722]: DEBUG nova.compute.resource_tracker [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 537.936338] env[60722]: INFO nova.scheduler.client.report [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] [req-d90e1358-133d-44c4-a3f3-8e1e3ae92d4a] Created resource provider record via placement API for resource provider with UUID 6d7f336b-9351-4171-8197-866cdafbab42 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 537.951648] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b19652dd-7a38-4a27-b83b-8e8574453a21 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.961994] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed44dded-821b-41b8-ac91-34899fa7902f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.991044] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae42f67e-cd83-490f-982e-17d95cc1f45c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 537.997620] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d16b2ad1-09a4-48bc-a188-cd92f9abe49b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 538.010047] env[60722]: DEBUG nova.compute.provider_tree [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Updating inventory in ProviderTree for provider 6d7f336b-9351-4171-8197-866cdafbab42 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 538.044203] env[60722]: DEBUG nova.scheduler.client.report [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Updated inventory for provider 6d7f336b-9351-4171-8197-866cdafbab42 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 538.044421] env[60722]: DEBUG nova.compute.provider_tree [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Updating resource provider 6d7f336b-9351-4171-8197-866cdafbab42 generation from 0 to 1 during operation: update_inventory {{(pid=60722) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 538.044557] env[60722]: DEBUG nova.compute.provider_tree [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Updating inventory in ProviderTree for provider 6d7f336b-9351-4171-8197-866cdafbab42 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 538.083703] env[60722]: DEBUG nova.compute.provider_tree [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Updating resource provider 6d7f336b-9351-4171-8197-866cdafbab42 generation from 1 to 2 during operation: update_traits {{(pid=60722) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 538.100031] env[60722]: DEBUG nova.compute.resource_tracker [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 538.100180] env[60722]: DEBUG oslo_concurrency.lockutils [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.333s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 538.100338] env[60722]: DEBUG nova.service [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Creating RPC server for service compute {{(pid=60722) start /opt/stack/nova/nova/service.py:182}} [ 538.115815] env[60722]: DEBUG nova.service [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] Join ServiceGroup membership for this service compute {{(pid=60722) start /opt/stack/nova/nova/service.py:199}} [ 538.116011] env[60722]: DEBUG nova.servicegroup.drivers.db [None req-70ec0153-f0b6-4a4d-9430-a8e2341a5344 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=60722) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 569.771263] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Acquiring lock "dd2a3121-462e-4bd7-b238-790341617abf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 569.771541] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 
tempest-ServerDiagnosticsTest-464680813-project-member] Lock "dd2a3121-462e-4bd7-b238-790341617abf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 569.791555] env[60722]: DEBUG nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 569.888845] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 569.889112] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 569.890634] env[60722]: INFO nova.compute.claims [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 570.037379] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d76a8e1-f2d2-41ad-a041-c881ae19652a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.047856] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c21c1779-b403-475e-aa2f-5ee7ccaa66af {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.084213] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74e78c72-6d7e-4a7b-8105-123e5566f947 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.091426] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e1914b-a41d-4093-bc18-8e931f0378a6 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.105388] env[60722]: DEBUG nova.compute.provider_tree [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 570.118030] env[60722]: DEBUG nova.scheduler.client.report [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 
48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 570.147381] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 570.147381] env[60722]: DEBUG nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 570.189804] env[60722]: DEBUG nova.compute.utils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 570.194176] env[60722]: DEBUG nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 570.194176] env[60722]: DEBUG nova.network.neutron [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 570.209643] env[60722]: DEBUG nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 570.292573] env[60722]: DEBUG nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 571.980559] env[60722]: DEBUG nova.virt.hardware [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 571.981625] env[60722]: DEBUG nova.virt.hardware [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 571.982362] env[60722]: DEBUG nova.virt.hardware [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 571.983256] env[60722]: DEBUG nova.virt.hardware [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 571.983256] env[60722]: DEBUG nova.virt.hardware [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 571.983256] env[60722]: DEBUG nova.virt.hardware [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 571.983990] env[60722]: DEBUG nova.virt.hardware [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 571.983990] env[60722]: DEBUG nova.virt.hardware [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 571.984312] env[60722]: DEBUG nova.virt.hardware [None 
req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 571.985029] env[60722]: DEBUG nova.virt.hardware [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 571.985029] env[60722]: DEBUG nova.virt.hardware [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 571.985855] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d816ce84-dc67-4dc4-a923-d422c2f31c25 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.006269] env[60722]: DEBUG nova.policy [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be310bc41ee4494b9f51819cc04fcdbd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'caeb255a71a44bf5b42254b34490c11c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 572.009293] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1d1c5f1-48ea-4c91-b6cd-87b0911ebc45 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.027049] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df97f7aa-9669-4e43-bd1e-1cde99d172ea {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.501423] env[60722]: DEBUG nova.network.neutron [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Successfully created port: cf5e5966-49cf-4da1-8cc0-2311e60d2822 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 575.006686] env[60722]: ERROR nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port cf5e5966-49cf-4da1-8cc0-2311e60d2822, please check neutron logs for more information. 
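The PortBindingFailed error above is what Nova raises when the Neutron port it just created or updated comes back with a failed binding. A minimal, self-contained sketch of that kind of check follows; the 'binding:vif_type' attribute and the 'binding_failed' value follow the usual Neutron port schema, but the class and function names here are illustrative, not Nova's actual code.

    VIF_TYPE_BINDING_FAILED = 'binding_failed'

    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                "Binding failed for port %s, please check neutron logs "
                "for more information." % port_id)

    def ensure_no_port_binding_failure(port):
        # A port whose binding the L2 backend could not complete is returned
        # by Neutron with vif_type 'binding_failed'.
        if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
            raise PortBindingFailed(port['id'])

    # Example: a port dict as Neutron might return it after a failed binding.
    ensure_no_port_binding_failure({
        'id': 'cf5e5966-49cf-4da1-8cc0-2311e60d2822',
        'binding:vif_type': VIF_TYPE_BINDING_FAILED,
    })  # raises PortBindingFailed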
[ 575.006686] env[60722]: ERROR nova.compute.manager Traceback (most recent call last): [ 575.006686] env[60722]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 575.006686] env[60722]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 575.006686] env[60722]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 575.006686] env[60722]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 575.006686] env[60722]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 575.006686] env[60722]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 575.006686] env[60722]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 575.006686] env[60722]: ERROR nova.compute.manager self.force_reraise() [ 575.006686] env[60722]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 575.006686] env[60722]: ERROR nova.compute.manager raise self.value [ 575.006686] env[60722]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 575.006686] env[60722]: ERROR nova.compute.manager updated_port = self._update_port( [ 575.006686] env[60722]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 575.006686] env[60722]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 575.007348] env[60722]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 575.007348] env[60722]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 575.007348] env[60722]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port cf5e5966-49cf-4da1-8cc0-2311e60d2822, please check neutron logs for more information. 
[ 575.007348] env[60722]: ERROR nova.compute.manager [ 575.007348] env[60722]: Traceback (most recent call last): [ 575.007348] env[60722]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 575.007348] env[60722]: listener.cb(fileno) [ 575.007348] env[60722]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 575.007348] env[60722]: result = function(*args, **kwargs) [ 575.007348] env[60722]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 575.007348] env[60722]: return func(*args, **kwargs) [ 575.007348] env[60722]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 575.007348] env[60722]: raise e [ 575.007348] env[60722]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 575.007348] env[60722]: nwinfo = self.network_api.allocate_for_instance( [ 575.007348] env[60722]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 575.007348] env[60722]: created_port_ids = self._update_ports_for_instance( [ 575.007348] env[60722]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 575.007348] env[60722]: with excutils.save_and_reraise_exception(): [ 575.007348] env[60722]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 575.007348] env[60722]: self.force_reraise() [ 575.007348] env[60722]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 575.007348] env[60722]: raise self.value [ 575.007348] env[60722]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 575.007348] env[60722]: updated_port = self._update_port( [ 575.007348] env[60722]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 575.007348] env[60722]: _ensure_no_port_binding_failure(port) [ 575.007348] env[60722]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 575.007348] env[60722]: raise exception.PortBindingFailed(port_id=port['id']) [ 575.008173] env[60722]: nova.exception.PortBindingFailed: Binding failed for port cf5e5966-49cf-4da1-8cc0-2311e60d2822, please check neutron logs for more information. [ 575.008173] env[60722]: Removing descriptor: 11 [ 575.008481] env[60722]: ERROR nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port cf5e5966-49cf-4da1-8cc0-2311e60d2822, please check neutron logs for more information. 
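Both tracebacks above pass through oslo_utils.excutils.save_and_reraise_exception: cleanup code runs inside the with-block, and the original exception is re-raised on exit, which is why the force_reraise / "raise self.value" frames appear between the caller and the original raise site. The following is a simplified, self-contained stand-in for that pattern, not the oslo implementation.

    import sys

    class SaveAndReraise:
        """Toy version of the save-and-reraise context-manager pattern."""

        def __enter__(self):
            # Capture the exception currently being handled; the block is
            # expected to be entered from inside an `except` handler.
            self._original = sys.exc_info()[1]
            return self

        def __exit__(self, exc_type, exc_val, exc_tb):
            if exc_type is None and self._original is not None:
                # Cleanup finished without a new error: re-raise the saved
                # exception so the caller still sees the original failure.
                raise self._original
            return False  # let any new exception from the cleanup propagate

    def update_port(port_id):
        raise RuntimeError("Binding failed for port %s" % port_id)

    def update_ports_for_instance(port_id):
        try:
            update_port(port_id)
        except Exception:
            with SaveAndReraise():
                print("rolling back partially-created ports")  # cleanup hook

    # update_ports_for_instance('cf5e5966-...') prints the cleanup message,
    # then re-raises the original RuntimeError to its caller.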
[ 575.008481] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] Traceback (most recent call last): [ 575.008481] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 575.008481] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] yield resources [ 575.008481] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 575.008481] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] self.driver.spawn(context, instance, image_meta, [ 575.008481] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 575.008481] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 575.008481] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 575.008481] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] vm_ref = self.build_virtual_machine(instance, [ 575.008481] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] vif_infos = vmwarevif.get_vif_info(self._session, [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] for vif in network_info: [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] return self._sync_wrapper(fn, *args, **kwargs) [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] self.wait() [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] self[:] = self._gt.wait() [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] return self._exit_event.wait() [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 575.008749] env[60722]: ERROR nova.compute.manager 
[instance: dd2a3121-462e-4bd7-b238-790341617abf] result = hub.switch() [ 575.008749] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] return self.greenlet.switch() [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] result = function(*args, **kwargs) [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] return func(*args, **kwargs) [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] raise e [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] nwinfo = self.network_api.allocate_for_instance( [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] created_port_ids = self._update_ports_for_instance( [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 575.009067] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] with excutils.save_and_reraise_exception(): [ 575.009374] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 575.009374] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] self.force_reraise() [ 575.009374] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 575.009374] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] raise self.value [ 575.009374] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 575.009374] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] updated_port = self._update_port( [ 575.009374] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 575.009374] env[60722]: ERROR nova.compute.manager [instance: 
dd2a3121-462e-4bd7-b238-790341617abf] _ensure_no_port_binding_failure(port) [ 575.009374] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 575.009374] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] raise exception.PortBindingFailed(port_id=port['id']) [ 575.009374] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] nova.exception.PortBindingFailed: Binding failed for port cf5e5966-49cf-4da1-8cc0-2311e60d2822, please check neutron logs for more information. [ 575.009374] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] [ 575.009659] env[60722]: INFO nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Terminating instance [ 575.011671] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Acquiring lock "refresh_cache-dd2a3121-462e-4bd7-b238-790341617abf" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 575.011832] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Acquired lock "refresh_cache-dd2a3121-462e-4bd7-b238-790341617abf" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 575.011991] env[60722]: DEBUG nova.network.neutron [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 575.058551] env[60722]: DEBUG nova.network.neutron [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Instance cache missing network info. 
{{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 575.159631] env[60722]: DEBUG nova.network.neutron [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 575.192076] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Releasing lock "refresh_cache-dd2a3121-462e-4bd7-b238-790341617abf" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 575.192355] env[60722]: DEBUG nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 575.192563] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 575.193323] env[60722]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ccbc1aa2-e09c-4a87-99f2-7d30d0b5405f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.233685] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd732479-c34d-4118-b152-4021826de6a4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.257030] env[60722]: WARNING nova.virt.vmwareapi.vmops [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance dd2a3121-462e-4bd7-b238-790341617abf could not be found. [ 575.257219] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 575.257525] env[60722]: INFO nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Took 0.06 seconds to destroy the instance on the hypervisor. [ 575.257784] env[60722]: DEBUG oslo.service.loopingcall [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 575.257973] env[60722]: DEBUG nova.compute.manager [-] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 575.258077] env[60722]: DEBUG nova.network.neutron [-] [instance: dd2a3121-462e-4bd7-b238-790341617abf] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 575.286309] env[60722]: DEBUG nova.network.neutron [-] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 575.298855] env[60722]: DEBUG nova.network.neutron [-] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 575.310081] env[60722]: INFO nova.compute.manager [-] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Took 0.05 seconds to deallocate network for instance. [ 575.312021] env[60722]: DEBUG nova.compute.claims [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 575.312174] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 575.313473] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 575.392792] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72273493-6fd8-4b22-af4e-d52f43127cc0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.404047] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-980f89e9-9d78-440e-b3a2-5d046dab1827 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.439384] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8543b2dd-f4e1-4bea-9bee-eb9c5941dcb6 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.447767] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02814549-504c-4ca9-bf97-36bfb5e8ab65 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.467703] env[60722]: DEBUG nova.compute.provider_tree [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 
tempest-ServerDiagnosticsTest-464680813-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 575.479774] env[60722]: DEBUG nova.scheduler.client.report [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 575.502399] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.189s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 575.502654] env[60722]: ERROR nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port cf5e5966-49cf-4da1-8cc0-2311e60d2822, please check neutron logs for more information. 
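After the spawn fails, the claim taken under the "compute_resources" lock earlier is rolled back by abort_instance_claim under that same lock, and because the inventory comparison above shows no change, no further write to Placement is made. A toy model of that claim/abort bracket, with hypothetical names and fields, is sketched below.

    from dataclasses import dataclass, field

    @dataclass
    class FakeResourceTracker:
        """Toy model of the claim / abort bracket seen in this log: a claim
        adds the instance's usage, and aborting the claim after a failed
        build removes it again. Hypothetical simplification, not Nova code."""
        vcpus_used: int = 0
        ram_mb_used: int = 0
        claims: dict = field(default_factory=dict)

        def instance_claim(self, uuid, vcpus, ram_mb):
            self.claims[uuid] = (vcpus, ram_mb)
            self.vcpus_used += vcpus
            self.ram_mb_used += ram_mb

        def abort_instance_claim(self, uuid):
            vcpus, ram_mb = self.claims.pop(uuid)
            self.vcpus_used -= vcpus
            self.ram_mb_used -= ram_mb

    rt = FakeResourceTracker()
    rt.instance_claim("dd2a3121-462e-4bd7-b238-790341617abf", vcpus=1, ram_mb=128)
    rt.abort_instance_claim("dd2a3121-462e-4bd7-b238-790341617abf")
    assert rt.vcpus_used == 0 and rt.ram_mb_used == 0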
[ 575.502654] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] Traceback (most recent call last): [ 575.502654] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 575.502654] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] self.driver.spawn(context, instance, image_meta, [ 575.502654] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 575.502654] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] self._vmops.spawn(context, instance, image_meta, injected_files, [ 575.502654] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 575.502654] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] vm_ref = self.build_virtual_machine(instance, [ 575.502654] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 575.502654] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] vif_infos = vmwarevif.get_vif_info(self._session, [ 575.502654] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] for vif in network_info: [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] return self._sync_wrapper(fn, *args, **kwargs) [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] self.wait() [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] self[:] = self._gt.wait() [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] return self._exit_event.wait() [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] result = hub.switch() [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 575.502946] env[60722]: ERROR 
nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] return self.greenlet.switch() [ 575.502946] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] result = function(*args, **kwargs) [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] return func(*args, **kwargs) [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] raise e [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] nwinfo = self.network_api.allocate_for_instance( [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] created_port_ids = self._update_ports_for_instance( [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] with excutils.save_and_reraise_exception(): [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 575.503281] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] self.force_reraise() [ 575.503625] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 575.503625] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] raise self.value [ 575.503625] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 575.503625] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] updated_port = self._update_port( [ 575.503625] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 575.503625] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] _ensure_no_port_binding_failure(port) [ 575.503625] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 575.503625] env[60722]: ERROR 
nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] raise exception.PortBindingFailed(port_id=port['id']) [ 575.503625] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] nova.exception.PortBindingFailed: Binding failed for port cf5e5966-49cf-4da1-8cc0-2311e60d2822, please check neutron logs for more information. [ 575.503625] env[60722]: ERROR nova.compute.manager [instance: dd2a3121-462e-4bd7-b238-790341617abf] [ 575.503625] env[60722]: DEBUG nova.compute.utils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Binding failed for port cf5e5966-49cf-4da1-8cc0-2311e60d2822, please check neutron logs for more information. {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 575.510886] env[60722]: DEBUG nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Build of instance dd2a3121-462e-4bd7-b238-790341617abf was re-scheduled: Binding failed for port cf5e5966-49cf-4da1-8cc0-2311e60d2822, please check neutron logs for more information. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 575.511822] env[60722]: DEBUG nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 575.511822] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Acquiring lock "refresh_cache-dd2a3121-462e-4bd7-b238-790341617abf" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 575.511822] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Acquired lock "refresh_cache-dd2a3121-462e-4bd7-b238-790341617abf" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 575.511822] env[60722]: DEBUG nova.network.neutron [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 575.547933] env[60722]: DEBUG nova.network.neutron [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Instance cache missing network info. 
{{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 575.672520] env[60722]: DEBUG nova.network.neutron [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 575.687778] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Releasing lock "refresh_cache-dd2a3121-462e-4bd7-b238-790341617abf" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 575.687945] env[60722]: DEBUG nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 575.687945] env[60722]: DEBUG nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 575.688241] env[60722]: DEBUG nova.network.neutron [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 575.757681] env[60722]: DEBUG nova.network.neutron [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 575.768293] env[60722]: DEBUG nova.network.neutron [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 575.782356] env[60722]: INFO nova.compute.manager [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] [instance: dd2a3121-462e-4bd7-b238-790341617abf] Took 0.09 seconds to deallocate network for instance. 
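The "Acquiring lock ... waited Xs" / "released ... held Ys" pairs that recur throughout this log come from oslo.concurrency's lock wrapper, which times how long a caller waited for a named lock and how long it held it. The snippet below is a self-contained illustration of that accounting, not the oslo.concurrency implementation.

    import threading
    import time

    class LoggedLock:
        """Illustrative named lock that reports wait and hold times."""

        def __init__(self, name):
            self.name = name
            self._lock = threading.Lock()

        def __enter__(self):
            print(f'Acquiring lock "{self.name}"')
            start = time.monotonic()
            self._lock.acquire()
            self._acquired_at = time.monotonic()
            print(f'Lock "{self.name}" acquired :: waited '
                  f'{self._acquired_at - start:.3f}s')
            return self

        def __exit__(self, *exc):
            held = time.monotonic() - self._acquired_at
            self._lock.release()
            print(f'Lock "{self.name}" "released" :: held {held:.3f}s')

    with LoggedLock("compute_resources"):
        pass  # e.g. update the tracked usage for this compute node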
[ 575.899672] env[60722]: INFO nova.scheduler.client.report [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Deleted allocations for instance dd2a3121-462e-4bd7-b238-790341617abf [ 575.919708] env[60722]: DEBUG oslo_concurrency.lockutils [None req-deee4ea7-7872-4dae-9af4-101366ac265d tempest-ServerDiagnosticsTest-464680813 tempest-ServerDiagnosticsTest-464680813-project-member] Lock "dd2a3121-462e-4bd7-b238-790341617abf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.148s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 577.968932] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Acquiring lock "a2ad54e2-c14a-4548-aedd-01668745b397" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 577.969518] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Lock "a2ad54e2-c14a-4548-aedd-01668745b397" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 577.990564] env[60722]: DEBUG nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Starting instance... 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 578.061777] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 578.061777] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 578.061777] env[60722]: INFO nova.compute.claims [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 578.149021] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a0291ff-2888-4fb5-b9fa-c1dd3c42c97d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 578.158661] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0fef638-97e2-40be-aab7-a7af932478b1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 578.195254] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec706aed-7a1c-4220-a43d-3294c47e9f56 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 578.203354] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e873a2e6-8fbb-41aa-8320-06882de4f191 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 578.222250] env[60722]: DEBUG nova.compute.provider_tree [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 578.234869] env[60722]: DEBUG nova.scheduler.client.report [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 578.265285] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 
tempest-TenantUsagesTestJSON-565169773-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.205s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 578.266265] env[60722]: DEBUG nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 578.322875] env[60722]: DEBUG nova.compute.utils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 578.325575] env[60722]: DEBUG nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 578.326089] env[60722]: DEBUG nova.network.neutron [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 578.355062] env[60722]: DEBUG nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 578.473201] env[60722]: DEBUG nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 578.509266] env[60722]: DEBUG nova.virt.hardware [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 578.509481] env[60722]: DEBUG nova.virt.hardware [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 578.509634] env[60722]: DEBUG nova.virt.hardware [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 578.509805] env[60722]: DEBUG nova.virt.hardware [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 578.509947] env[60722]: DEBUG nova.virt.hardware [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 578.512109] env[60722]: DEBUG nova.virt.hardware [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 578.512365] env[60722]: DEBUG nova.virt.hardware [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 578.513571] env[60722]: DEBUG nova.virt.hardware [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 578.513571] env[60722]: DEBUG nova.virt.hardware [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa 
tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 578.513571] env[60722]: DEBUG nova.virt.hardware [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 578.513571] env[60722]: DEBUG nova.virt.hardware [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 578.516053] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3066157-b432-4dfb-9314-f7f4f2926557 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 578.524754] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62d8168c-6584-4b98-b134-5ddb571b547a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 578.756019] env[60722]: DEBUG nova.policy [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '43b3512749594d5b8de9e6f600bee65b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0f7f375e7cad4025ad0de7daca3a39ac', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 580.456828] env[60722]: DEBUG nova.network.neutron [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Successfully created port: 30d46334-77b8-491a-a7ce-144b0930d12f {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 583.211966] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Acquiring lock "42da538f-82b8-4c91-93e3-1dc84a2eabda" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.212323] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Lock "42da538f-82b8-4c91-93e3-1dc84a2eabda" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.232818] env[60722]: DEBUG nova.compute.manager [None 
req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 583.305151] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.305151] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.305151] env[60722]: INFO nova.compute.claims [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 583.416328] env[60722]: DEBUG nova.network.neutron [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Successfully updated port: 30d46334-77b8-491a-a7ce-144b0930d12f {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 583.433247] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Acquiring lock "refresh_cache-a2ad54e2-c14a-4548-aedd-01668745b397" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 583.434029] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Acquired lock "refresh_cache-a2ad54e2-c14a-4548-aedd-01668745b397" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 583.434029] env[60722]: DEBUG nova.network.neutron [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 583.439892] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec89536a-1bcb-4ddc-940c-321e330036d3 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.457043] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27b2ea41-3319-4096-ad83-3f4c29a93af0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.494711] env[60722]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4123ae6a-ce6e-4782-9328-27a02fff5297 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.506362] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af1e0cd4-3e24-4554-b038-41d7225277d8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.522959] env[60722]: DEBUG nova.compute.provider_tree [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 583.534824] env[60722]: DEBUG nova.scheduler.client.report [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 583.553642] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.553893] env[60722]: DEBUG nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 583.607855] env[60722]: DEBUG nova.compute.utils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 583.609461] env[60722]: DEBUG nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Allocating IP information in the background. 
{{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 583.609650] env[60722]: DEBUG nova.network.neutron [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 583.636040] env[60722]: DEBUG nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 583.719803] env[60722]: DEBUG nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Start spawning the instance on the hypervisor. {{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 583.730129] env[60722]: DEBUG nova.network.neutron [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 583.754710] env[60722]: DEBUG nova.virt.hardware [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 583.754853] env[60722]: DEBUG nova.virt.hardware [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 583.754949] env[60722]: DEBUG nova.virt.hardware [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 583.755632] env[60722]: DEBUG nova.virt.hardware [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Flavor pref 0:0:0 {{(pid=60722) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 583.755836] env[60722]: DEBUG nova.virt.hardware [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 583.756208] env[60722]: DEBUG nova.virt.hardware [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 583.756309] env[60722]: DEBUG nova.virt.hardware [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 583.756463] env[60722]: DEBUG nova.virt.hardware [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 583.757157] env[60722]: DEBUG nova.virt.hardware [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 583.757157] env[60722]: DEBUG nova.virt.hardware [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 583.758696] env[60722]: DEBUG nova.virt.hardware [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 583.759809] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7913cb93-f229-41b9-b924-3da4335034e9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.774250] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d81c6ba9-4dcb-4717-9f92-eee4d021d3fa {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.227418] env[60722]: DEBUG nova.policy [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b9679f0aa54741088181cc63b789795d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8894d5d73ace4392964c535d3d3abe15', 
'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 584.253576] env[60722]: DEBUG nova.compute.manager [req-e1697607-f52e-46ce-a2ce-f7d2acba5572 req-1a89c254-a843-4b6b-bd73-60505485b20d service nova] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Received event network-vif-plugged-30d46334-77b8-491a-a7ce-144b0930d12f {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 584.253877] env[60722]: DEBUG oslo_concurrency.lockutils [req-e1697607-f52e-46ce-a2ce-f7d2acba5572 req-1a89c254-a843-4b6b-bd73-60505485b20d service nova] Acquiring lock "a2ad54e2-c14a-4548-aedd-01668745b397-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.256095] env[60722]: DEBUG oslo_concurrency.lockutils [req-e1697607-f52e-46ce-a2ce-f7d2acba5572 req-1a89c254-a843-4b6b-bd73-60505485b20d service nova] Lock "a2ad54e2-c14a-4548-aedd-01668745b397-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.256095] env[60722]: DEBUG oslo_concurrency.lockutils [req-e1697607-f52e-46ce-a2ce-f7d2acba5572 req-1a89c254-a843-4b6b-bd73-60505485b20d service nova] Lock "a2ad54e2-c14a-4548-aedd-01668745b397-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.256241] env[60722]: DEBUG nova.compute.manager [req-e1697607-f52e-46ce-a2ce-f7d2acba5572 req-1a89c254-a843-4b6b-bd73-60505485b20d service nova] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] No waiting events found dispatching network-vif-plugged-30d46334-77b8-491a-a7ce-144b0930d12f {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 584.259878] env[60722]: WARNING nova.compute.manager [req-e1697607-f52e-46ce-a2ce-f7d2acba5572 req-1a89c254-a843-4b6b-bd73-60505485b20d service nova] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Received unexpected event network-vif-plugged-30d46334-77b8-491a-a7ce-144b0930d12f for instance with vm_state building and task_state spawning. 
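The network-vif-plugged handling just above shows the compute manager's external-event dispatch: the spawn path can register events it expects for an instance, and when Neutron delivers the event the manager pops the matching waiter under the per-instance "-events" lock; here nothing had been registered yet, so the event is logged as unexpected while the instance is still in vm_state building / task_state spawning. The snippet below is a simplified, hypothetical illustration of that register/pop pattern using plain threading; it is not Nova's InstanceEvents implementation, and the UUIDs are just the ones from the log entries above.

    import threading

    class InstanceEvents:
        """Toy registry: register expected events, pop them when the
        external notification arrives."""

        def __init__(self):
            self._events = {}          # instance uuid -> {event name: threading.Event}
            self._lock = threading.Lock()

        def prepare(self, instance_uuid, name):
            ev = threading.Event()
            with self._lock:
                self._events.setdefault(instance_uuid, {})[name] = ev
            return ev

        def pop(self, instance_uuid, name):
            with self._lock:
                return self._events.get(instance_uuid, {}).pop(name, None)

    events = InstanceEvents()
    instance = "a2ad54e2-c14a-4548-aedd-01668745b397"
    event = "network-vif-plugged-30d46334-77b8-491a-a7ce-144b0930d12f"

    # The external event arrives before anyone registered a waiter:
    waiter = events.pop(instance, event)
    if waiter is None:
        # Corresponds to the "No waiting events found" / "Received unexpected
        # event" lines in the log above.
        print("Received unexpected event %s" % event)
    else:
        waiter.set()  # would unblock a spawn thread waiting on this event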
[ 584.757102] env[60722]: DEBUG nova.network.neutron [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Updating instance_info_cache with network_info: [{"id": "30d46334-77b8-491a-a7ce-144b0930d12f", "address": "fa:16:3e:db:55:10", "network": {"id": "dacebd88-01b3-4b24-8af6-26769aea4f15", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0b91883855c8437587c531188adfc164", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30d46334-77", "ovs_interfaceid": "30d46334-77b8-491a-a7ce-144b0930d12f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 584.773395] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Releasing lock "refresh_cache-a2ad54e2-c14a-4548-aedd-01668745b397" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 584.773861] env[60722]: DEBUG nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Instance network_info: |[{"id": "30d46334-77b8-491a-a7ce-144b0930d12f", "address": "fa:16:3e:db:55:10", "network": {"id": "dacebd88-01b3-4b24-8af6-26769aea4f15", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0b91883855c8437587c531188adfc164", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30d46334-77", "ovs_interfaceid": "30d46334-77b8-491a-a7ce-144b0930d12f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 584.774328] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: 
a2ad54e2-c14a-4548-aedd-01668745b397] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:db:55:10', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '30d46334-77b8-491a-a7ce-144b0930d12f', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 584.789765] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 584.793575] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-094b6002-9cc8-4fb4-999c-7077af8892f4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.807233] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Created folder: OpenStack in parent group-v4. [ 584.807347] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Creating folder: Project (0f7f375e7cad4025ad0de7daca3a39ac). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 584.807964] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1a0561e5-f958-4ca5-a3eb-6d0c495fb565 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.822592] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Created folder: Project (0f7f375e7cad4025ad0de7daca3a39ac) in parent group-v141606. [ 584.824235] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Creating folder: Instances. Parent ref: group-v141607. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 584.824506] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5e0c58a8-8cab-441b-8b83-73d73d188df1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.834700] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Created folder: Instances in parent group-v141607. [ 584.835168] env[60722]: DEBUG oslo.service.loopingcall [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 584.837407] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 584.837713] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1457de94-8c50-4331-825b-7f4bc79a1bef {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.861319] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 584.861319] env[60722]: value = "task-565116" [ 584.861319] env[60722]: _type = "Task" [ 584.861319] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 584.870250] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565116, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 585.373973] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565116, 'name': CreateVM_Task, 'duration_secs': 0.402811} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 585.374318] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 585.471176] env[60722]: DEBUG oslo_vmware.service [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-453ad24e-52fa-4184-8a14-f0998cf5e060 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.479290] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 585.479985] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 585.481369] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 585.482355] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-95bc6183-33bc-45f2-9a69-2fe7536023f2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.490023] env[60722]: DEBUG oslo_vmware.api [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 
tempest-TenantUsagesTestJSON-565169773-project-member] Waiting for the task: (returnval){ [ 585.490023] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52285d9f-307f-9cae-a099-2fa98670ebee" [ 585.490023] env[60722]: _type = "Task" [ 585.490023] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 585.500707] env[60722]: DEBUG oslo_vmware.api [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52285d9f-307f-9cae-a099-2fa98670ebee, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 586.001992] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 586.002508] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 586.002508] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 586.003344] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 586.003344] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 586.003344] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5a1a5570-ea5c-4f22-a196-8062999cc8d5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.024449] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 586.024759] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 
tempest-TenantUsagesTestJSON-565169773-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 586.025625] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56f5d499-2f41-45c3-a881-2e5285e797bb {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.034786] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ddd03d47-7573-4aaf-8830-cd1d549eed10 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.040493] env[60722]: DEBUG oslo_vmware.api [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Waiting for the task: (returnval){ [ 586.040493] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52855cee-a080-074b-3082-dfe46e5acf81" [ 586.040493] env[60722]: _type = "Task" [ 586.040493] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 586.050916] env[60722]: DEBUG oslo_vmware.api [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52855cee-a080-074b-3082-dfe46e5acf81, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 586.285182] env[60722]: DEBUG nova.network.neutron [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Successfully created port: f1836018-d292-4080-8c1d-c1b0ad1a3c74 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 586.557826] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 586.558074] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Creating directory with path [datastore1] vmware_temp/f16d5ba2-a470-4ed8-96a5-8bade3e2016f/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 586.560018] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-23392d5d-4c08-4587-9b58-a577b4138feb {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.593495] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Created directory with path [datastore1] vmware_temp/f16d5ba2-a470-4ed8-96a5-8bade3e2016f/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 586.593643] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None 
req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Fetch image to [datastore1] vmware_temp/f16d5ba2-a470-4ed8-96a5-8bade3e2016f/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 586.593841] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/f16d5ba2-a470-4ed8-96a5-8bade3e2016f/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 586.594623] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb5de102-2d52-4722-9653-5de11201ea14 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.607122] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-980b1722-be6c-47c4-8f59-45c6a78bd623 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.619941] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-820372a1-3cfa-40e6-8aa9-02f6604feb61 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.660131] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f944c07-1a10-4d8f-859e-89cfc94da5a5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.670325] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cb841e94-5123-435f-839a-404aaef046d4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.762381] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 586.841657] env[60722]: DEBUG oslo_vmware.rw_handles [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f16d5ba2-a470-4ed8-96a5-8bade3e2016f/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 586.910271] env[60722]: DEBUG oslo_vmware.rw_handles [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Completed reading data from the image iterator. 
{{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 586.910271] env[60722]: DEBUG oslo_vmware.rw_handles [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f16d5ba2-a470-4ed8-96a5-8bade3e2016f/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 587.379974] env[60722]: DEBUG nova.compute.manager [req-85ef5751-b273-4cd2-8cb4-9850d8167b6d req-b48d89a1-ca51-417f-b548-c0ea56c628e5 service nova] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Received event network-changed-30d46334-77b8-491a-a7ce-144b0930d12f {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 587.380289] env[60722]: DEBUG nova.compute.manager [req-85ef5751-b273-4cd2-8cb4-9850d8167b6d req-b48d89a1-ca51-417f-b548-c0ea56c628e5 service nova] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Refreshing instance network info cache due to event network-changed-30d46334-77b8-491a-a7ce-144b0930d12f. {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 587.380372] env[60722]: DEBUG oslo_concurrency.lockutils [req-85ef5751-b273-4cd2-8cb4-9850d8167b6d req-b48d89a1-ca51-417f-b548-c0ea56c628e5 service nova] Acquiring lock "refresh_cache-a2ad54e2-c14a-4548-aedd-01668745b397" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 587.380579] env[60722]: DEBUG oslo_concurrency.lockutils [req-85ef5751-b273-4cd2-8cb4-9850d8167b6d req-b48d89a1-ca51-417f-b548-c0ea56c628e5 service nova] Acquired lock "refresh_cache-a2ad54e2-c14a-4548-aedd-01668745b397" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 587.380684] env[60722]: DEBUG nova.network.neutron [req-85ef5751-b273-4cd2-8cb4-9850d8167b6d req-b48d89a1-ca51-417f-b548-c0ea56c628e5 service nova] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Refreshing network info cache for port 30d46334-77b8-491a-a7ce-144b0930d12f {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 588.289285] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "d1623803-5152-47d0-b1a7-5e8d4ab06233" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.289285] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "d1623803-5152-47d0-b1a7-5e8d4ab06233" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.299122] env[60722]: DEBUG nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Starting instance... 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 588.353876] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.354149] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.356229] env[60722]: INFO nova.compute.claims [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 588.469982] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2480a8e6-5e40-41e3-8c72-9d6fd541347b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.481488] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8247de2d-9af9-4741-af78-d0aac10b13f8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.511603] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf53baa2-ebf9-4961-869e-2409e3354f03 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.519435] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afa31a75-15e4-4bf1-bb50-a836d8f051ec {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.533211] env[60722]: DEBUG nova.compute.provider_tree [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 588.541226] env[60722]: DEBUG nova.scheduler.client.report [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 588.555202] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 
tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.201s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.555708] env[60722]: DEBUG nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 588.591645] env[60722]: DEBUG nova.compute.utils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 588.597675] env[60722]: DEBUG nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 588.597675] env[60722]: DEBUG nova.network.neutron [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 588.603654] env[60722]: DEBUG nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 588.695736] env[60722]: DEBUG nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 588.724933] env[60722]: DEBUG nova.virt.hardware [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 588.725552] env[60722]: DEBUG nova.virt.hardware [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 588.725771] env[60722]: DEBUG nova.virt.hardware [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 588.726084] env[60722]: DEBUG nova.virt.hardware [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 588.726237] env[60722]: DEBUG nova.virt.hardware [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 588.726752] env[60722]: DEBUG nova.virt.hardware [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 588.726752] env[60722]: DEBUG nova.virt.hardware [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 588.726752] env[60722]: DEBUG nova.virt.hardware [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
588.727094] env[60722]: DEBUG nova.virt.hardware [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 588.727255] env[60722]: DEBUG nova.virt.hardware [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 588.727475] env[60722]: DEBUG nova.virt.hardware [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 588.728369] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0134071e-1c90-46ff-980f-cf8256dec57c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.738194] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fa4282c-ba9c-405f-81e6-532bb9d8bdb9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 589.019156] env[60722]: DEBUG nova.policy [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc19a92ba0b94e64955176c4a6f8e51e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9911ec5f31004b6493a91b6994b789c1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 589.575914] env[60722]: DEBUG nova.network.neutron [req-85ef5751-b273-4cd2-8cb4-9850d8167b6d req-b48d89a1-ca51-417f-b548-c0ea56c628e5 service nova] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Updated VIF entry in instance network info cache for port 30d46334-77b8-491a-a7ce-144b0930d12f. 
{{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 589.575914] env[60722]: DEBUG nova.network.neutron [req-85ef5751-b273-4cd2-8cb4-9850d8167b6d req-b48d89a1-ca51-417f-b548-c0ea56c628e5 service nova] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Updating instance_info_cache with network_info: [{"id": "30d46334-77b8-491a-a7ce-144b0930d12f", "address": "fa:16:3e:db:55:10", "network": {"id": "dacebd88-01b3-4b24-8af6-26769aea4f15", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.228", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0b91883855c8437587c531188adfc164", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30d46334-77", "ovs_interfaceid": "30d46334-77b8-491a-a7ce-144b0930d12f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 589.586251] env[60722]: DEBUG oslo_concurrency.lockutils [req-85ef5751-b273-4cd2-8cb4-9850d8167b6d req-b48d89a1-ca51-417f-b548-c0ea56c628e5 service nova] Releasing lock "refresh_cache-a2ad54e2-c14a-4548-aedd-01668745b397" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 590.563261] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Acquiring lock "d09f5c24-d76b-4ff9-acdd-8da94d70f9cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 590.563261] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Lock "d09f5c24-d76b-4ff9-acdd-8da94d70f9cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 590.574673] env[60722]: DEBUG nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Starting instance... 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 590.630086] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 590.630812] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 590.633425] env[60722]: INFO nova.compute.claims [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 590.725737] env[60722]: DEBUG nova.network.neutron [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Successfully updated port: f1836018-d292-4080-8c1d-c1b0ad1a3c74 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 590.736396] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Acquiring lock "refresh_cache-42da538f-82b8-4c91-93e3-1dc84a2eabda" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 590.736526] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Acquired lock "refresh_cache-42da538f-82b8-4c91-93e3-1dc84a2eabda" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 590.736670] env[60722]: DEBUG nova.network.neutron [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 590.773191] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34e94c61-18f4-41a3-b4d9-06ea3319791c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 590.781409] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad3de684-8fd3-4f0a-9dcd-19043faf2d7d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 590.814387] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4906e1dd-08fe-48a1-acdd-87e98f85c3b5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 590.824666] env[60722]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c30f508-953d-404c-b30e-a57a8b44e3f1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 590.839620] env[60722]: DEBUG nova.compute.provider_tree [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 590.848646] env[60722]: DEBUG nova.scheduler.client.report [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 590.864365] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.234s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 590.864956] env[60722]: DEBUG nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 590.904545] env[60722]: DEBUG nova.compute.utils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 590.907271] env[60722]: DEBUG nova.network.neutron [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 590.909450] env[60722]: DEBUG nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Allocating IP information in the background. 
{{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 590.909896] env[60722]: DEBUG nova.network.neutron [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 590.916570] env[60722]: DEBUG nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 591.004141] env[60722]: DEBUG nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Start spawning the instance on the hypervisor. {{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 591.029750] env[60722]: DEBUG nova.virt.hardware [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 591.030032] env[60722]: DEBUG nova.virt.hardware [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 591.030558] env[60722]: DEBUG nova.virt.hardware [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 591.030756] env[60722]: DEBUG nova.virt.hardware [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 591.030928] env[60722]: DEBUG nova.virt.hardware [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 591.031116] env[60722]: DEBUG nova.virt.hardware [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 
tempest-ServersTestJSON-1160895630-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 591.031290] env[60722]: DEBUG nova.virt.hardware [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 591.031439] env[60722]: DEBUG nova.virt.hardware [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 591.031600] env[60722]: DEBUG nova.virt.hardware [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 591.032020] env[60722]: DEBUG nova.virt.hardware [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 591.032244] env[60722]: DEBUG nova.virt.hardware [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 591.033115] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dffa7ae0-d5c9-4e22-bb01-94c0e73afce3 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.044040] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c8561a7-8f06-41b6-9717-43e145f00d7b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.099161] env[60722]: DEBUG nova.network.neutron [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Successfully created port: 3194faba-1b8e-4540-ad08-1eb13eb82802 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 591.290634] env[60722]: DEBUG nova.policy [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bcf84123ffba4fa7b3e2528c44526713', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3f1f4a6c22054e81819695c8461fa7fc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize 
/opt/stack/nova/nova/policy.py:203}} [ 591.550730] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Acquiring lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.550991] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.567369] env[60722]: DEBUG nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 591.658918] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.659291] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 591.661719] env[60722]: INFO nova.compute.claims [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 591.830657] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18da7066-4fbb-43d6-86a1-f402fb154a30 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.839423] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-811cbd92-b4b3-42de-87b3-b5518277a848 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.873869] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-366eda42-f0a2-4de9-8861-3e55b08e6493 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.884549] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2def08b2-a899-40ed-9c70-a8ff4a7dc0d9 {{(pid=60722) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.899188] env[60722]: DEBUG nova.compute.provider_tree [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 591.911325] env[60722]: DEBUG nova.scheduler.client.report [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 591.939982] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 591.939982] env[60722]: DEBUG nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 592.012902] env[60722]: DEBUG nova.compute.utils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 592.014255] env[60722]: DEBUG nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 592.014418] env[60722]: DEBUG nova.network.neutron [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 592.031037] env[60722]: DEBUG nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Start building block device mappings for instance. 
{{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 592.117166] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 592.119940] env[60722]: DEBUG nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Start spawning the instance on the hypervisor. {{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 592.154349] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Getting list of instances from cluster (obj){ [ 592.154349] env[60722]: value = "domain-c8" [ 592.154349] env[60722]: _type = "ClusterComputeResource" [ 592.154349] env[60722]: } {{(pid=60722) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 592.156081] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8dfdb474-aafb-48fb-9836-a334e7025728 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.161325] env[60722]: DEBUG nova.virt.hardware [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 592.161612] env[60722]: DEBUG nova.virt.hardware [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 592.161672] env[60722]: DEBUG nova.virt.hardware [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 592.161836] env[60722]: DEBUG nova.virt.hardware [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 592.162042] env[60722]: DEBUG nova.virt.hardware [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 
tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 592.162116] env[60722]: DEBUG nova.virt.hardware [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 592.162365] env[60722]: DEBUG nova.virt.hardware [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 592.162449] env[60722]: DEBUG nova.virt.hardware [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 592.162647] env[60722]: DEBUG nova.virt.hardware [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 592.162783] env[60722]: DEBUG nova.virt.hardware [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 592.162947] env[60722]: DEBUG nova.virt.hardware [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 592.164717] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50da20e4-0f06-4fc7-9d17-3c3f21a9605c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.177581] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "1b18a8e4-eab9-4f28-bd87-a354c436b51c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.177923] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "1b18a8e4-eab9-4f28-bd87-a354c436b51c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s 
{{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 592.179282] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Got total of 1 instances {{(pid=60722) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 592.179282] env[60722]: WARNING nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] While synchronizing instance power states, found 5 instances in the database and 1 instances on the hypervisor. [ 592.179282] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Triggering sync for uuid a2ad54e2-c14a-4548-aedd-01668745b397 {{(pid=60722) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 592.179464] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Triggering sync for uuid 42da538f-82b8-4c91-93e3-1dc84a2eabda {{(pid=60722) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 592.179464] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Triggering sync for uuid d1623803-5152-47d0-b1a7-5e8d4ab06233 {{(pid=60722) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 592.179595] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Triggering sync for uuid d09f5c24-d76b-4ff9-acdd-8da94d70f9cb {{(pid=60722) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 592.179746] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Triggering sync for uuid 65901b4a-42cf-4795-abc2-b0fea1f4fee7 {{(pid=60722) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10218}} [ 592.180779] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "a2ad54e2-c14a-4548-aedd-01668745b397" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.181242] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "42da538f-82b8-4c91-93e3-1dc84a2eabda" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.181440] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "d1623803-5152-47d0-b1a7-5e8d4ab06233" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.182509] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "d09f5c24-d76b-4ff9-acdd-8da94d70f9cb" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.182509] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.182509] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 592.182509] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Getting list of instances from cluster (obj){ [ 592.182509] env[60722]: value = "domain-c8" [ 592.182509] env[60722]: _type = "ClusterComputeResource" [ 592.182509] env[60722]: } {{(pid=60722) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 592.186849] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61db27ce-e957-452b-9b4e-1cb7edca9638 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.192910] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8aab7930-7d69-4fc7-a00b-c29d3eb77fb1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.197195] env[60722]: DEBUG nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 592.213434] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Got total of 1 instances {{(pid=60722) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 592.284363] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.284603] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 592.287724] env[60722]: INFO nova.compute.claims [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 592.374838] env[60722]: DEBUG nova.network.neutron [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Updating instance_info_cache with network_info: [{"id": "f1836018-d292-4080-8c1d-c1b0ad1a3c74", "address": "fa:16:3e:59:2d:75", "network": {"id": 
"dacebd88-01b3-4b24-8af6-26769aea4f15", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.217", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0b91883855c8437587c531188adfc164", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf1836018-d2", "ovs_interfaceid": "f1836018-d292-4080-8c1d-c1b0ad1a3c74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 592.398267] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Releasing lock "refresh_cache-42da538f-82b8-4c91-93e3-1dc84a2eabda" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 592.398553] env[60722]: DEBUG nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Instance network_info: |[{"id": "f1836018-d292-4080-8c1d-c1b0ad1a3c74", "address": "fa:16:3e:59:2d:75", "network": {"id": "dacebd88-01b3-4b24-8af6-26769aea4f15", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.217", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0b91883855c8437587c531188adfc164", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf1836018-d2", "ovs_interfaceid": "f1836018-d292-4080-8c1d-c1b0ad1a3c74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 592.399477] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:59:2d:75', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f1836018-d292-4080-8c1d-c1b0ad1a3c74', 'vif_model': 'vmxnet3'}] 
{{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 592.406456] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Creating folder: Project (8894d5d73ace4392964c535d3d3abe15). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 592.409336] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b9b5e182-95b9-4a55-b6df-3a27af116317 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.426021] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Created folder: Project (8894d5d73ace4392964c535d3d3abe15) in parent group-v141606. [ 592.426021] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Creating folder: Instances. Parent ref: group-v141610. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 592.426021] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dceaf043-6fc9-4f42-808a-ab20a25e2419 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.439024] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Created folder: Instances in parent group-v141610. [ 592.439024] env[60722]: DEBUG oslo.service.loopingcall [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 592.439024] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 592.439024] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7786e0ff-b126-4542-98b0-c0bfc71bd137 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.462170] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 592.462170] env[60722]: value = "task-565119" [ 592.462170] env[60722]: _type = "Task" [ 592.462170] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 592.470042] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565119, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 592.511119] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bd6783b-d693-417a-a494-43080f942e6e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.517898] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-418c8757-eb38-4894-a252-0e847df7be2a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.553242] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e5f4ec6-2e66-42ab-b186-73e691a2950b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.565718] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57be7289-4500-4678-8c02-d7b53c02c2d6 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.580115] env[60722]: DEBUG nova.compute.provider_tree [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 592.590185] env[60722]: DEBUG nova.policy [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '212b0d1ffbd44587a4477f1c201bd543', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1227381c32aa438a9029dbd9ecccd4b1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 592.592691] env[60722]: DEBUG nova.scheduler.client.report [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 592.611308] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 592.611765] env[60722]: DEBUG nova.compute.manager [None 
req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 592.653136] env[60722]: DEBUG nova.compute.utils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 592.659030] env[60722]: DEBUG nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 592.659030] env[60722]: DEBUG nova.network.neutron [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 592.674313] env[60722]: DEBUG nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 592.773026] env[60722]: DEBUG nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 592.802770] env[60722]: DEBUG nova.virt.hardware [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 592.803061] env[60722]: DEBUG nova.virt.hardware [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 592.803545] env[60722]: DEBUG nova.virt.hardware [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 592.803807] env[60722]: DEBUG nova.virt.hardware [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 592.803955] env[60722]: DEBUG nova.virt.hardware [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 592.804112] env[60722]: DEBUG nova.virt.hardware [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 592.804315] env[60722]: DEBUG nova.virt.hardware [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 592.804466] env[60722]: DEBUG nova.virt.hardware [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
592.804621] env[60722]: DEBUG nova.virt.hardware [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 592.804776] env[60722]: DEBUG nova.virt.hardware [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 592.804933] env[60722]: DEBUG nova.virt.hardware [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 592.805822] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-781d18fa-bf82-47fb-9df4-8aab3156519c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.815088] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee212e9c-5c79-4c6e-a76f-ccbd6ab08d89 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.973409] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565119, 'name': CreateVM_Task, 'duration_secs': 0.313256} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 592.973409] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 592.975066] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 592.975307] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 592.976041] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 592.976041] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3430cf93-2f9a-4eb8-a083-353f9b9616b5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.980954] env[60722]: DEBUG oslo_vmware.api [None 
req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Waiting for the task: (returnval){ [ 592.980954] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52c98e50-e76d-5545-f6ee-f62f7b169891" [ 592.980954] env[60722]: _type = "Task" [ 592.980954] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 592.993551] env[60722]: DEBUG oslo_vmware.api [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52c98e50-e76d-5545-f6ee-f62f7b169891, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 593.236585] env[60722]: DEBUG nova.policy [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dc19a92ba0b94e64955176c4a6f8e51e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9911ec5f31004b6493a91b6994b789c1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 593.493141] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 593.493362] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 593.493507] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 593.913444] env[60722]: DEBUG nova.network.neutron [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Successfully created port: bb696c50-f4e4-41f7-b7ea-1db9e799a4dd {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 593.939189] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "bfde3558-9940-4402-bdf9-15c23c285a8f" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 593.939189] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "bfde3558-9940-4402-bdf9-15c23c285a8f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 593.948851] env[60722]: DEBUG nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 593.998019] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 593.998019] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 593.998019] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Starting heal instance info cache {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 593.998019] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Rebuilding the list of instances to heal {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 594.010488] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.011070] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.016483] env[60722]: INFO nova.compute.claims [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 594.023306] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Skipping network cache update for instance because it is Building. 
{{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 594.023306] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 594.023306] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 594.023306] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 594.023306] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 594.023523] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 594.023523] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Didn't find any instances for network info cache update. 
{{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 594.023523] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 594.023523] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 594.023995] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 594.026212] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 594.026212] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 594.026212] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 594.026212] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60722) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 594.026212] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 594.033451] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.181190] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf4ea485-719a-4256-824f-0b5bc4bc5938 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.190847] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37301f15-bbfe-4048-a1e1-e11171c45d73 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.229400] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41bf2029-cbed-46a4-b9a5-63f56e901846 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.233142] env[60722]: DEBUG nova.compute.manager [req-f77d8687-c700-49e0-b03b-5459fa64f6f9 req-e4350d5f-e5ff-4fcd-9b1b-fbb57c797172 service nova] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Received event network-vif-plugged-f1836018-d292-4080-8c1d-c1b0ad1a3c74 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 594.233490] env[60722]: DEBUG oslo_concurrency.lockutils [req-f77d8687-c700-49e0-b03b-5459fa64f6f9 req-e4350d5f-e5ff-4fcd-9b1b-fbb57c797172 service nova] Acquiring lock "42da538f-82b8-4c91-93e3-1dc84a2eabda-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.233740] env[60722]: DEBUG oslo_concurrency.lockutils [req-f77d8687-c700-49e0-b03b-5459fa64f6f9 req-e4350d5f-e5ff-4fcd-9b1b-fbb57c797172 service nova] Lock "42da538f-82b8-4c91-93e3-1dc84a2eabda-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.233867] env[60722]: DEBUG oslo_concurrency.lockutils [req-f77d8687-c700-49e0-b03b-5459fa64f6f9 req-e4350d5f-e5ff-4fcd-9b1b-fbb57c797172 service nova] Lock "42da538f-82b8-4c91-93e3-1dc84a2eabda-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.234025] env[60722]: DEBUG nova.compute.manager [req-f77d8687-c700-49e0-b03b-5459fa64f6f9 req-e4350d5f-e5ff-4fcd-9b1b-fbb57c797172 service nova] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] No waiting events found dispatching network-vif-plugged-f1836018-d292-4080-8c1d-c1b0ad1a3c74 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 594.234181] env[60722]: WARNING nova.compute.manager [req-f77d8687-c700-49e0-b03b-5459fa64f6f9 req-e4350d5f-e5ff-4fcd-9b1b-fbb57c797172 service nova] 
[instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Received unexpected event network-vif-plugged-f1836018-d292-4080-8c1d-c1b0ad1a3c74 for instance with vm_state building and task_state spawning. [ 594.240704] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d3b4d89-09e9-4790-852e-0d88044453fe {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.256795] env[60722]: DEBUG nova.compute.provider_tree [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 594.267765] env[60722]: DEBUG nova.scheduler.client.report [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 594.284968] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.289933] env[60722]: DEBUG nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Start building networks asynchronously for instance. 
{{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 594.289933] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.256s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.289933] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.290050] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 594.291254] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47ccbc53-38f6-44a1-9fbd-c6bfd350002f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.299925] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61246a8a-b169-4a02-a5ff-a0a72310bbf8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.317093] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0caa0c4-d446-42e3-9da7-1c41abbf8fbf {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.324306] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b05ce4cf-be30-47fb-99b5-4a006e9692e0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.329858] env[60722]: DEBUG nova.compute.utils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 594.331467] env[60722]: DEBUG nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Allocating IP information in the background. 
{{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 594.331633] env[60722]: DEBUG nova.network.neutron [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 594.361285] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181717MB free_disk=100GB free_vcpus=48 pci_devices=None {{(pid=60722) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 594.361435] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.361694] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.363407] env[60722]: DEBUG nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 594.447172] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance a2ad54e2-c14a-4548-aedd-01668745b397 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 594.447172] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 42da538f-82b8-4c91-93e3-1dc84a2eabda actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 594.447308] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance d1623803-5152-47d0-b1a7-5e8d4ab06233 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 594.447519] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance d09f5c24-d76b-4ff9-acdd-8da94d70f9cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 594.447596] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 65901b4a-42cf-4795-abc2-b0fea1f4fee7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 594.447652] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 1b18a8e4-eab9-4f28-bd87-a354c436b51c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 594.447770] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance bfde3558-9940-4402-bdf9-15c23c285a8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 594.447970] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 594.449645] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 594.455602] env[60722]: DEBUG nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 594.483452] env[60722]: DEBUG nova.virt.hardware [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 594.484079] env[60722]: DEBUG nova.virt.hardware [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 594.484079] env[60722]: DEBUG nova.virt.hardware [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 594.484079] env[60722]: DEBUG nova.virt.hardware [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 594.485040] env[60722]: DEBUG nova.virt.hardware [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 594.485223] env[60722]: DEBUG nova.virt.hardware [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 594.486099] env[60722]: DEBUG nova.virt.hardware [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 594.486099] env[60722]: DEBUG nova.virt.hardware [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 594.486099] env[60722]: DEBUG nova.virt.hardware [None 
req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 594.486099] env[60722]: DEBUG nova.virt.hardware [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 594.486099] env[60722]: DEBUG nova.virt.hardware [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 594.489280] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b9a70a1-fb33-48af-958f-2889230198e4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.502396] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2e25fed-f276-41b0-b33a-4a61050e9293 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.587415] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-add0b8cd-278b-46b9-af06-ea2e2a771834 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.600032] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a5041b9-1838-4cc1-9947-2c0e46437fd5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.632710] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb434740-1816-4be7-9576-e8a8a518ddf7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.640513] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb0f1d4d-623c-4252-ba69-d3df3b6ae3f1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.654047] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 594.664849] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 594.682761] env[60722]: DEBUG nova.compute.resource_tracker [None 
req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 594.682943] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.321s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.783742] env[60722]: DEBUG nova.policy [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd06aa1d647d84c24bb54d8cd19ff73e9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '380ddb7dbc52479fb72e82724f8b295d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 594.818068] env[60722]: DEBUG nova.network.neutron [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Successfully created port: 795fd727-775e-476e-b53d-6d712cb0d9e2 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 595.141079] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "bc2a1e45-2f48-4a73-bfee-69a20725a610" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 595.141363] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "bc2a1e45-2f48-4a73-bfee-69a20725a610" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 595.159544] env[60722]: DEBUG nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Starting instance... 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 595.230442] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 595.230614] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 595.232177] env[60722]: INFO nova.compute.claims [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 595.465786] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7064a754-890f-4da6-bc26-e86493ac39fb {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 595.475019] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31c0000e-c6bd-4ff3-87a7-495040a39901 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 595.511658] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f814effd-e811-4111-b9a9-2e59edc52b18 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 595.521302] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bb27a6b-ff9d-4952-8fbd-46b76b883409 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 595.535092] env[60722]: DEBUG nova.compute.provider_tree [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 595.546932] env[60722]: DEBUG nova.scheduler.client.report [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 595.577327] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f 
tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.346s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 595.581590] env[60722]: DEBUG nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 595.630274] env[60722]: DEBUG nova.compute.utils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 595.636798] env[60722]: DEBUG nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 595.636798] env[60722]: DEBUG nova.network.neutron [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 595.652045] env[60722]: DEBUG nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 595.758288] env[60722]: DEBUG nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 595.800473] env[60722]: DEBUG nova.virt.hardware [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 595.800631] env[60722]: DEBUG nova.virt.hardware [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 595.800710] env[60722]: DEBUG nova.virt.hardware [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 595.800889] env[60722]: DEBUG nova.virt.hardware [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 595.801659] env[60722]: DEBUG nova.virt.hardware [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 595.801895] env[60722]: DEBUG nova.virt.hardware [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 595.802493] env[60722]: DEBUG nova.virt.hardware [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 595.802751] env[60722]: DEBUG nova.virt.hardware [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 595.802815] 
env[60722]: DEBUG nova.virt.hardware [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 595.802952] env[60722]: DEBUG nova.virt.hardware [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 595.803130] env[60722]: DEBUG nova.virt.hardware [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 595.804526] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d59827a-9776-4253-aea0-349f4f566ee2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 595.813644] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afceb9e6-c2df-4de0-960c-e69711ba49ff {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.042697] env[60722]: DEBUG nova.network.neutron [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Successfully created port: ab6639e6-d484-4b11-ba0a-470a82b1d444 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 596.176832] env[60722]: DEBUG nova.policy [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8dacf7015bd4aeb8e6b7277a2f0a337', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '03406ae6612c4ceabe8c940d457db3fe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 596.249612] env[60722]: DEBUG nova.network.neutron [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Successfully updated port: 3194faba-1b8e-4540-ad08-1eb13eb82802 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 596.294504] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "refresh_cache-d1623803-5152-47d0-b1a7-5e8d4ab06233" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 596.294662] env[60722]: DEBUG oslo_concurrency.lockutils [None 
req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquired lock "refresh_cache-d1623803-5152-47d0-b1a7-5e8d4ab06233" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 596.294818] env[60722]: DEBUG nova.network.neutron [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 596.432139] env[60722]: DEBUG nova.network.neutron [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 596.644230] env[60722]: DEBUG nova.network.neutron [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Successfully created port: cadacd56-203f-44bf-bd8f-9b659a9e085a {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 597.605489] env[60722]: DEBUG nova.network.neutron [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Updating instance_info_cache with network_info: [{"id": "3194faba-1b8e-4540-ad08-1eb13eb82802", "address": "fa:16:3e:39:d1:a2", "network": {"id": "ac22e575-6c87-499f-ace7-1e9de353b8fc", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-255045585-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9911ec5f31004b6493a91b6994b789c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d859f07-052d-4a69-bdf1-24261a6a6daa", "external-id": "nsx-vlan-transportzone-684", "segmentation_id": 684, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3194faba-1b", "ovs_interfaceid": "3194faba-1b8e-4540-ad08-1eb13eb82802", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 597.621570] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Releasing lock "refresh_cache-d1623803-5152-47d0-b1a7-5e8d4ab06233" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 597.621808] env[60722]: DEBUG nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 
tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Instance network_info: |[{"id": "3194faba-1b8e-4540-ad08-1eb13eb82802", "address": "fa:16:3e:39:d1:a2", "network": {"id": "ac22e575-6c87-499f-ace7-1e9de353b8fc", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-255045585-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9911ec5f31004b6493a91b6994b789c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d859f07-052d-4a69-bdf1-24261a6a6daa", "external-id": "nsx-vlan-transportzone-684", "segmentation_id": 684, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3194faba-1b", "ovs_interfaceid": "3194faba-1b8e-4540-ad08-1eb13eb82802", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 597.622199] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:39:d1:a2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2d859f07-052d-4a69-bdf1-24261a6a6daa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3194faba-1b8e-4540-ad08-1eb13eb82802', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 597.634049] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Creating folder: Project (9911ec5f31004b6493a91b6994b789c1). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 597.635951] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6fcb06dc-9276-40a4-9ffc-fc01b0fac467 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.649579] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Created folder: Project (9911ec5f31004b6493a91b6994b789c1) in parent group-v141606. [ 597.649907] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Creating folder: Instances. Parent ref: group-v141613. 
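The folder layout being built here, a "Project (<tenant id>)" folder with an "Instances" child under the group-v141606 parent, comes from plain Folder.CreateFolder calls against vCenter. A minimal sketch of that call pattern follows; the session and parent folder reference are assumed to exist already and are not taken from this log.

```python
def make_project_folders(session, parent_folder_ref, project_id):
    """Create the "Project (<id>)" / "Instances" folder pair logged above.

    `session` is assumed to be an authenticated oslo.vmware VMwareAPISession
    (see the next sketch) and `parent_folder_ref` stands in for the
    group-v141606 parent. CreateFolder is a plain vim call rather than a
    vCenter task, so no task polling shows up for it in the log.
    """
    project = session.invoke_api(session.vim, 'CreateFolder',
                                 parent_folder_ref,
                                 name='Project (%s)' % project_id)
    return session.invoke_api(session.vim, 'CreateFolder', project,
                              name='Instances')
```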
{{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 597.650225] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0881f19e-83bb-4666-aa08-48c37a4f4d54 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.662396] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Created folder: Instances in parent group-v141613. [ 597.662396] env[60722]: DEBUG oslo.service.loopingcall [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 597.662598] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 597.662865] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-709cbcb5-476b-4ef4-a605-b0eac5a1af7c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.691261] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 597.691261] env[60722]: value = "task-565122" [ 597.691261] env[60722]: _type = "Task" [ 597.691261] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 597.698172] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565122, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 598.203748] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "e93b8d4b-6286-410a-870a-02fa7e59d90d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 598.203748] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "e93b8d4b-6286-410a-870a-02fa7e59d90d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 598.216288] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565122, 'name': CreateVM_Task, 'duration_secs': 0.305784} completed successfully. 
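The CreateVM_Task lines around here show the usual oslo.vmware invoke-then-poll pattern: the API call returns a task reference, and wait_for_task polls it until vCenter reports success. A hedged sketch of that pattern; the endpoint, credentials, and the folder/spec/pool objects are placeholders, not values from this log.

```python
from oslo_vmware import api as vmware_api


def create_vm(session, folder_ref, config_spec, respool_ref):
    """Invoke CreateVM_Task and block until vCenter finishes it.

    wait_for_task() polls the task on the session's poll interval (the
    "Waiting for the task" / "progress is 0%" / "completed successfully"
    lines above) and returns the task info, whose .result is the new VM's
    managed object reference.
    """
    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=config_spec, pool=respool_ref)
    return session.wait_for_task(task).result


# Endpoint and credentials are placeholders, not values from this log.
session = vmware_api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                      api_retry_count=10,
                                      task_poll_interval=0.5)
```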
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 598.216464] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 598.217204] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 598.217367] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 598.217677] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 598.217947] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3aa0e386-0775-4e0f-b9aa-7c443f7bc17e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.224033] env[60722]: DEBUG oslo_vmware.api [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for the task: (returnval){ [ 598.224033] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52e29cb7-1cb9-5f5d-b00e-83d5ebd781cb" [ 598.224033] env[60722]: _type = "Task" [ 598.224033] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 598.224322] env[60722]: DEBUG nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 598.239052] env[60722]: DEBUG oslo_vmware.api [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52e29cb7-1cb9-5f5d-b00e-83d5ebd781cb, 'name': SearchDatastore_Task} progress is 0%. 
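The paired "Acquiring lock / Acquired lock / Releasing lock" and "acquired by ... :: waited / released ... :: held" messages around the instance UUID, the compute_resources claim, and the datastore image-cache key are emitted by oslo.concurrency's lockutils. Both forms it offers look roughly like this; the lock names below are placeholders.

```python
from oslo_concurrency import lockutils

# Context-manager form: emits the "Acquiring lock" / "Acquired lock" /
# "Releasing lock" DEBUG lines (lockutils.py:312/315/333 in this log).
with lockutils.lock('refresh_cache-<instance-uuid>'):
    pass  # e.g. rebuild the instance network info cache


# Decorator form: emits the "acquired by ... :: waited" /
# "released ... :: held" variants (lockutils.py:404/409/423 in this log).
@lockutils.synchronized('compute_resources')
def instance_claim():
    pass  # e.g. claim VCPU/RAM/disk against the resource tracker
```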
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 598.287012] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 598.287366] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 598.288970] env[60722]: INFO nova.compute.claims [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 598.456119] env[60722]: DEBUG nova.network.neutron [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Successfully updated port: bb696c50-f4e4-41f7-b7ea-1db9e799a4dd {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 598.473036] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Acquiring lock "refresh_cache-d09f5c24-d76b-4ff9-acdd-8da94d70f9cb" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 598.473036] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Acquired lock "refresh_cache-d09f5c24-d76b-4ff9-acdd-8da94d70f9cb" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 598.473036] env[60722]: DEBUG nova.network.neutron [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 598.570221] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fea49ba-b1c2-4775-892e-f986d92120c9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.583277] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b198aa13-9c3d-4bce-b32f-e0fa6a856a0a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.615965] env[60722]: DEBUG nova.network.neutron [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Successfully created port: 1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e {{(pid=60722) 
_create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 598.618405] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c2b51d7-4d62-4b67-a0ea-19f3a837c4de {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.634433] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "93268011-e1f2-4041-b4df-473c06d3f1eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 598.634717] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "93268011-e1f2-4041-b4df-473c06d3f1eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 598.637549] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d27d6a3-ff50-41e7-8b99-326686bc9022 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.645458] env[60722]: DEBUG nova.network.neutron [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 598.647669] env[60722]: DEBUG nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Starting instance... 
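The "Successfully created port" / "Successfully updated port" messages correspond to ordinary Neutron port CRUD done on the instance's behalf. Nova drives this through its internal Neutron client plumbing; an equivalent standalone call with openstacksdk would look roughly like the sketch below, where the cloud name and device_owner are illustrative and only the network id is taken from the log.

```python
import openstack

# Cloud name and device_owner are illustrative; the network id is the one
# shown in the log for the ListServerFilters test network.
conn = openstack.connect(cloud='devstack')
port = conn.network.create_port(
    network_id='ac22e575-6c87-499f-ace7-1e9de353b8fc',
    device_owner='compute:nova')
print(port.id, port.mac_address, port.fixed_ips)
```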
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 598.660160] env[60722]: DEBUG nova.compute.provider_tree [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 598.670076] env[60722]: DEBUG nova.scheduler.client.report [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 598.692875] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.405s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 598.696550] env[60722]: DEBUG nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Start building networks asynchronously for instance. 
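The inventory data reported above is what Placement uses to size this host: for each resource class the usable capacity is (total - reserved) * allocation_ratio, so the node advertises 192 VCPU, 196078 MB of RAM and 400 GB of disk. A quick check of that arithmetic with the numbers from the log:

```python
# Inventory as reported above; capacity = (total - reserved) * allocation_ratio.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    print(rc, (inv['total'] - inv['reserved']) * inv['allocation_ratio'])
# VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
```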
{{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 598.717526] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 598.717768] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 598.719472] env[60722]: INFO nova.compute.claims [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 598.738014] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 598.738264] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 598.738603] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 598.753815] env[60722]: DEBUG nova.compute.utils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 598.755045] env[60722]: DEBUG nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Allocating IP information in the background. 
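"Start building networks asynchronously" and "Allocating IP information in the background" mean the manager kicks off the Neutron allocation on a green thread and keeps preparing block device mappings, only blocking on the result when it is actually needed. A rough sketch of that fire-then-wait shape using eventlet; the real nova helper and its wrapper types differ, and the function body here is a placeholder.

```python
import eventlet


def allocate_for_instance(instance_uuid):
    """Placeholder for the Neutron allocation that runs in the background."""
    return []  # would return the instance's network_info


# Fire the allocation off on a green thread, carry on with other build
# steps (block device mappings, image handling), and only wait for the
# result when it is needed.
gt = eventlet.spawn(allocate_for_instance,
                    'e93b8d4b-6286-410a-870a-02fa7e59d90d')
# ... build block device mappings here ...
network_info = gt.wait()
```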
{{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 598.755045] env[60722]: DEBUG nova.network.neutron [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 598.764271] env[60722]: DEBUG nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 598.841518] env[60722]: DEBUG nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Start spawning the instance on the hypervisor. {{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 598.874659] env[60722]: DEBUG nova.virt.hardware [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:41Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='84',id=12,is_public=True,memory_mb=192,name='m1.micro',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 598.874890] env[60722]: DEBUG nova.virt.hardware [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 598.875050] env[60722]: DEBUG nova.virt.hardware [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 598.875229] env[60722]: DEBUG nova.virt.hardware [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 598.875366] env[60722]: DEBUG nova.virt.hardware [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 
598.875506] env[60722]: DEBUG nova.virt.hardware [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 598.875708] env[60722]: DEBUG nova.virt.hardware [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 598.875864] env[60722]: DEBUG nova.virt.hardware [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 598.876036] env[60722]: DEBUG nova.virt.hardware [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 598.876195] env[60722]: DEBUG nova.virt.hardware [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 598.876501] env[60722]: DEBUG nova.virt.hardware [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 598.877635] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dad2af3-eeee-4518-a8f7-014acb2f2fc8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.894203] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c354277e-3f9c-467b-9f38-d670effe4f32 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.981466] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52a9015a-d853-4ceb-a906-dcd75f4515fa {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.989782] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ab7d65a-488f-4669-8b41-7a0f7519b147 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.041402] env[60722]: DEBUG nova.policy [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 
'dc19a92ba0b94e64955176c4a6f8e51e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9911ec5f31004b6493a91b6994b789c1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 599.050903] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c55c9c6e-83b7-4918-b2d4-45eb5fb0d9ad {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.059963] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c040e1c-5264-453a-9e8d-811f8d3c0643 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.077617] env[60722]: DEBUG nova.compute.provider_tree [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 599.096830] env[60722]: DEBUG nova.scheduler.client.report [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 599.121157] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.403s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 599.121727] env[60722]: DEBUG nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 599.161918] env[60722]: DEBUG nova.compute.utils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 599.163211] env[60722]: DEBUG nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Allocating IP information in the background. 
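The "Policy check for network:attach_external_network failed" lines are oslo.policy denials for a member/reader token; the request then simply proceeds without attaching external networks. A simplified reproduction of such a check is sketched below; the check string is an assumption for illustration (nova's real default effectively requires an admin context), and nova's own enforcement wrapper differs.

```python
from oslo_config import cfg
from oslo_policy import policy

enforcer = policy.Enforcer(cfg.CONF)
# Assumed check string, for illustration only.
enforcer.register_default(
    policy.RuleDefault('network:attach_external_network', 'role:admin'))

creds = {'roles': ['member', 'reader'],
         'project_id': '9911ec5f31004b6493a91b6994b789c1'}
print(enforcer.enforce('network:attach_external_network', {}, creds))  # False
```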
{{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 599.163942] env[60722]: DEBUG nova.network.neutron [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 599.175847] env[60722]: DEBUG nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 599.249670] env[60722]: DEBUG nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Start spawning the instance on the hypervisor. {{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 599.285655] env[60722]: DEBUG nova.virt.hardware [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 599.285655] env[60722]: DEBUG nova.virt.hardware [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 599.285655] env[60722]: DEBUG nova.virt.hardware [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 599.285969] env[60722]: DEBUG nova.virt.hardware [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 599.285969] env[60722]: DEBUG nova.virt.hardware [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 599.286165] env[60722]: DEBUG nova.virt.hardware [None 
req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 599.286386] env[60722]: DEBUG nova.virt.hardware [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 599.286552] env[60722]: DEBUG nova.virt.hardware [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 599.286716] env[60722]: DEBUG nova.virt.hardware [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 599.286878] env[60722]: DEBUG nova.virt.hardware [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 599.287060] env[60722]: DEBUG nova.virt.hardware [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 599.288519] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38948813-3666-4bdd-8149-2c67e1239210 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.299572] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e3f9518-de13-45d0-8f53-6d513c463eb5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.387292] env[60722]: DEBUG nova.network.neutron [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Successfully updated port: 795fd727-775e-476e-b53d-6d712cb0d9e2 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 599.399920] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Acquiring lock "refresh_cache-65901b4a-42cf-4795-abc2-b0fea1f4fee7" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 599.400416] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 
tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Acquired lock "refresh_cache-65901b4a-42cf-4795-abc2-b0fea1f4fee7" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 599.400464] env[60722]: DEBUG nova.network.neutron [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 599.578183] env[60722]: DEBUG nova.policy [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd06aa1d647d84c24bb54d8cd19ff73e9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '380ddb7dbc52479fb72e82724f8b295d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 599.594273] env[60722]: DEBUG nova.network.neutron [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 599.814135] env[60722]: DEBUG nova.network.neutron [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Updating instance_info_cache with network_info: [{"id": "bb696c50-f4e4-41f7-b7ea-1db9e799a4dd", "address": "fa:16:3e:f5:85:26", "network": {"id": "0a60db10-ef9e-4f2e-8b4a-b5004ec2531e", "bridge": "br-int", "label": "tempest-ServersTestJSON-1154734442-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3f1f4a6c22054e81819695c8461fa7fc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f3a2eb5-353f-45c5-a73b-869626f4bb13", "external-id": "nsx-vlan-transportzone-411", "segmentation_id": 411, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbb696c50-f4", "ovs_interfaceid": "bb696c50-f4e4-41f7-b7ea-1db9e799a4dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 599.829876] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Releasing lock "refresh_cache-d09f5c24-d76b-4ff9-acdd-8da94d70f9cb" {{(pid=60722) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 599.830193] env[60722]: DEBUG nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Instance network_info: |[{"id": "bb696c50-f4e4-41f7-b7ea-1db9e799a4dd", "address": "fa:16:3e:f5:85:26", "network": {"id": "0a60db10-ef9e-4f2e-8b4a-b5004ec2531e", "bridge": "br-int", "label": "tempest-ServersTestJSON-1154734442-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3f1f4a6c22054e81819695c8461fa7fc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f3a2eb5-353f-45c5-a73b-869626f4bb13", "external-id": "nsx-vlan-transportzone-411", "segmentation_id": 411, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbb696c50-f4", "ovs_interfaceid": "bb696c50-f4e4-41f7-b7ea-1db9e799a4dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 599.831037] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f5:85:26', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9f3a2eb5-353f-45c5-a73b-869626f4bb13', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bb696c50-f4e4-41f7-b7ea-1db9e799a4dd', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 599.841661] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Creating folder: Project (3f1f4a6c22054e81819695c8461fa7fc). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 599.842544] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ee20ea54-1b62-43e5-814b-a05c7fe10ef9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.856718] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Created folder: Project (3f1f4a6c22054e81819695c8461fa7fc) in parent group-v141606. [ 599.856946] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Creating folder: Instances. Parent ref: group-v141616. 
{{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 599.857229] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d9536094-8c77-493c-be4a-a7db70501bc4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.867594] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Created folder: Instances in parent group-v141616. [ 599.867986] env[60722]: DEBUG oslo.service.loopingcall [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 599.868261] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 599.868488] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c2e04b37-ca7f-4d1e-bee2-6422d47bdb76 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.891541] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 599.891541] env[60722]: value = "task-565125" [ 599.891541] env[60722]: _type = "Task" [ 599.891541] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 599.900126] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565125, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 599.902349] env[60722]: DEBUG nova.network.neutron [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Successfully updated port: ab6639e6-d484-4b11-ba0a-470a82b1d444 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 599.904932] env[60722]: DEBUG nova.compute.manager [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Received event network-changed-f1836018-d292-4080-8c1d-c1b0ad1a3c74 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 599.905208] env[60722]: DEBUG nova.compute.manager [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Refreshing instance network info cache due to event network-changed-f1836018-d292-4080-8c1d-c1b0ad1a3c74. 
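The "Received event network-changed-<port>" message originates from Neutron notifying Nova through the os-server-external-events API, which is what triggers the network info cache refresh that follows. A hedged sketch of that notification as a raw HTTP call; the endpoint, token and status field are placeholders, while the server and port UUIDs are the ones named in the event above.

```python
import requests

# Placeholder endpoint and token; the UUIDs are the server and port ids
# named in the "network-changed" event above.
payload = {'events': [{'name': 'network-changed',
                       'server_uuid': '42da538f-82b8-4c91-93e3-1dc84a2eabda',
                       'tag': 'f1836018-d292-4080-8c1d-c1b0ad1a3c74',
                       'status': 'completed'}]}
resp = requests.post(
    'http://nova.example.test/compute/v2.1/os-server-external-events',
    json=payload,
    headers={'X-Auth-Token': '<service token>'})
print(resp.status_code)  # 200 when the events are accepted
```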
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 599.905459] env[60722]: DEBUG oslo_concurrency.lockutils [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] Acquiring lock "refresh_cache-42da538f-82b8-4c91-93e3-1dc84a2eabda" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 599.905537] env[60722]: DEBUG oslo_concurrency.lockutils [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] Acquired lock "refresh_cache-42da538f-82b8-4c91-93e3-1dc84a2eabda" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 599.905761] env[60722]: DEBUG nova.network.neutron [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Refreshing network info cache for port f1836018-d292-4080-8c1d-c1b0ad1a3c74 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 599.916050] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "refresh_cache-1b18a8e4-eab9-4f28-bd87-a354c436b51c" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 599.916977] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquired lock "refresh_cache-1b18a8e4-eab9-4f28-bd87-a354c436b51c" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 599.916977] env[60722]: DEBUG nova.network.neutron [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 600.053678] env[60722]: DEBUG nova.network.neutron [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 600.408813] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565125, 'name': CreateVM_Task, 'duration_secs': 0.313413} completed successfully. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 600.408813] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 600.409388] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 600.409767] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 600.410606] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 600.413691] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d62590cc-02aa-4631-b960-2af17d0644f7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.425497] env[60722]: DEBUG oslo_vmware.api [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Waiting for the task: (returnval){ [ 600.425497] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52b35e9f-92d2-ca3d-d604-b0675e6e0b04" [ 600.425497] env[60722]: _type = "Task" [ 600.425497] env[60722]: } to complete. 
{{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 600.442109] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 600.442357] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 600.442556] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 600.859713] env[60722]: DEBUG nova.network.neutron [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Successfully updated port: cadacd56-203f-44bf-bd8f-9b659a9e085a {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 600.872885] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "refresh_cache-bfde3558-9940-4402-bdf9-15c23c285a8f" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 600.873644] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquired lock "refresh_cache-bfde3558-9940-4402-bdf9-15c23c285a8f" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 600.873858] env[60722]: DEBUG nova.network.neutron [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 600.917287] env[60722]: DEBUG nova.network.neutron [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Updating instance_info_cache with network_info: [{"id": "795fd727-775e-476e-b53d-6d712cb0d9e2", "address": "fa:16:3e:f4:80:7e", "network": {"id": "dacebd88-01b3-4b24-8af6-26769aea4f15", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0b91883855c8437587c531188adfc164", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap795fd727-77", "ovs_interfaceid": "795fd727-775e-476e-b53d-6d712cb0d9e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 600.937588] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Releasing lock "refresh_cache-65901b4a-42cf-4795-abc2-b0fea1f4fee7" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 600.937993] env[60722]: DEBUG nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Instance network_info: |[{"id": "795fd727-775e-476e-b53d-6d712cb0d9e2", "address": "fa:16:3e:f4:80:7e", "network": {"id": "dacebd88-01b3-4b24-8af6-26769aea4f15", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0b91883855c8437587c531188adfc164", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap795fd727-77", "ovs_interfaceid": "795fd727-775e-476e-b53d-6d712cb0d9e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 600.938284] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f4:80:7e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '795fd727-775e-476e-b53d-6d712cb0d9e2', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 600.946714] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Creating folder: Project 
(1227381c32aa438a9029dbd9ecccd4b1). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 600.948291] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4d9db9ae-9def-4988-98d5-81d3d697be00 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.961922] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Created folder: Project (1227381c32aa438a9029dbd9ecccd4b1) in parent group-v141606. [ 600.961922] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Creating folder: Instances. Parent ref: group-v141619. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 600.961922] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fe6609a9-661b-43a8-96e1-8437ad2720c1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.969361] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Created folder: Instances in parent group-v141619. [ 600.969578] env[60722]: DEBUG oslo.service.loopingcall [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 600.969747] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 600.969930] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-05149a5a-2edb-4dbb-bfdb-1b98d2c42789 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.991129] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 600.991129] env[60722]: value = "task-565128" [ 600.991129] env[60722]: _type = "Task" [ 600.991129] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 600.999019] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565128, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 601.011190] env[60722]: DEBUG nova.network.neutron [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Instance cache missing network info. 
{{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 601.318976] env[60722]: DEBUG nova.network.neutron [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Successfully created port: b28573a9-8f69-4a37-8f53-9a3b5374aa59 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 601.324945] env[60722]: DEBUG nova.network.neutron [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Updating instance_info_cache with network_info: [{"id": "ab6639e6-d484-4b11-ba0a-470a82b1d444", "address": "fa:16:3e:74:3f:ef", "network": {"id": "ac22e575-6c87-499f-ace7-1e9de353b8fc", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-255045585-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9911ec5f31004b6493a91b6994b789c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d859f07-052d-4a69-bdf1-24261a6a6daa", "external-id": "nsx-vlan-transportzone-684", "segmentation_id": 684, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapab6639e6-d4", "ovs_interfaceid": "ab6639e6-d484-4b11-ba0a-470a82b1d444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 601.344828] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Releasing lock "refresh_cache-1b18a8e4-eab9-4f28-bd87-a354c436b51c" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 601.345193] env[60722]: DEBUG nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Instance network_info: |[{"id": "ab6639e6-d484-4b11-ba0a-470a82b1d444", "address": "fa:16:3e:74:3f:ef", "network": {"id": "ac22e575-6c87-499f-ace7-1e9de353b8fc", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-255045585-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9911ec5f31004b6493a91b6994b789c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d859f07-052d-4a69-bdf1-24261a6a6daa", "external-id": 
"nsx-vlan-transportzone-684", "segmentation_id": 684, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapab6639e6-d4", "ovs_interfaceid": "ab6639e6-d484-4b11-ba0a-470a82b1d444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 601.346708] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:74:3f:ef', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2d859f07-052d-4a69-bdf1-24261a6a6daa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ab6639e6-d484-4b11-ba0a-470a82b1d444', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 601.358009] env[60722]: DEBUG oslo.service.loopingcall [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 601.358791] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 601.359268] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dc3c5370-df55-40f7-ad15-7610ca290dc9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.388220] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 601.388220] env[60722]: value = "task-565129" [ 601.388220] env[60722]: _type = "Task" [ 601.388220] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 601.394619] env[60722]: DEBUG nova.network.neutron [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Updated VIF entry in instance network info cache for port f1836018-d292-4080-8c1d-c1b0ad1a3c74. 
{{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 601.395027] env[60722]: DEBUG nova.network.neutron [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Updating instance_info_cache with network_info: [{"id": "f1836018-d292-4080-8c1d-c1b0ad1a3c74", "address": "fa:16:3e:59:2d:75", "network": {"id": "dacebd88-01b3-4b24-8af6-26769aea4f15", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.217", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0b91883855c8437587c531188adfc164", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf1836018-d2", "ovs_interfaceid": "f1836018-d292-4080-8c1d-c1b0ad1a3c74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 601.404831] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565129, 'name': CreateVM_Task} progress is 6%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 601.418183] env[60722]: DEBUG oslo_concurrency.lockutils [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] Releasing lock "refresh_cache-42da538f-82b8-4c91-93e3-1dc84a2eabda" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 601.418560] env[60722]: DEBUG nova.compute.manager [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Received event network-vif-plugged-3194faba-1b8e-4540-ad08-1eb13eb82802 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 601.418831] env[60722]: DEBUG oslo_concurrency.lockutils [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] Acquiring lock "d1623803-5152-47d0-b1a7-5e8d4ab06233-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 601.420598] env[60722]: DEBUG oslo_concurrency.lockutils [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] Lock "d1623803-5152-47d0-b1a7-5e8d4ab06233-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 601.420805] env[60722]: DEBUG oslo_concurrency.lockutils [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] Lock "d1623803-5152-47d0-b1a7-5e8d4ab06233-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 601.420976] env[60722]: DEBUG nova.compute.manager [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] No waiting events found dispatching network-vif-plugged-3194faba-1b8e-4540-ad08-1eb13eb82802 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 601.421562] env[60722]: WARNING nova.compute.manager [req-47f21744-f43e-42fb-a7ce-f78bc1bad36f req-102bd9da-4c58-4b7e-a3dd-057981a7077d service nova] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Received unexpected event network-vif-plugged-3194faba-1b8e-4540-ad08-1eb13eb82802 for instance with vm_state building and task_state spawning. [ 601.513317] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565128, 'name': CreateVM_Task} progress is 25%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 601.562788] env[60722]: DEBUG nova.network.neutron [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Successfully created port: c49cc32c-c002-4195-8e7b-7a5ec96c2efe {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 601.899656] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565129, 'name': CreateVM_Task, 'duration_secs': 0.317172} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 601.899656] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 601.904078] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 601.904078] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 601.904078] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 601.904078] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c50d8263-47a4-4e70-91dd-9cccb18fb308 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 601.912342] env[60722]: DEBUG oslo_vmware.api [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for the task: 
(returnval){ [ 601.912342] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]521e5f66-a9f8-c749-b1f1-21576fe9fe17" [ 601.912342] env[60722]: _type = "Task" [ 601.912342] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 601.920306] env[60722]: DEBUG oslo_vmware.api [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]521e5f66-a9f8-c749-b1f1-21576fe9fe17, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 601.970103] env[60722]: DEBUG nova.network.neutron [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Updating instance_info_cache with network_info: [{"id": "cadacd56-203f-44bf-bd8f-9b659a9e085a", "address": "fa:16:3e:72:0f:98", "network": {"id": "5415af15-bc0f-4c4d-9917-5173f42d133c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1874207349-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "380ddb7dbc52479fb72e82724f8b295d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3cf748a8-7ae0-4dca-817d-e727c30d72f4", "external-id": "nsx-vlan-transportzone-853", "segmentation_id": 853, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcadacd56-20", "ovs_interfaceid": "cadacd56-203f-44bf-bd8f-9b659a9e085a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 601.994248] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Releasing lock "refresh_cache-bfde3558-9940-4402-bdf9-15c23c285a8f" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 601.994618] env[60722]: DEBUG nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Instance network_info: |[{"id": "cadacd56-203f-44bf-bd8f-9b659a9e085a", "address": "fa:16:3e:72:0f:98", "network": {"id": "5415af15-bc0f-4c4d-9917-5173f42d133c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1874207349-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "380ddb7dbc52479fb72e82724f8b295d", "mtu": 8950, 
"physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3cf748a8-7ae0-4dca-817d-e727c30d72f4", "external-id": "nsx-vlan-transportzone-853", "segmentation_id": 853, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcadacd56-20", "ovs_interfaceid": "cadacd56-203f-44bf-bd8f-9b659a9e085a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 601.999246] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:72:0f:98', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3cf748a8-7ae0-4dca-817d-e727c30d72f4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cadacd56-203f-44bf-bd8f-9b659a9e085a', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 602.007695] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Creating folder: Project (380ddb7dbc52479fb72e82724f8b295d). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 602.008469] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-506f3f85-26e3-4fab-b01f-c082abc3d622 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.017486] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565128, 'name': CreateVM_Task} progress is 25%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 602.019089] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Created folder: Project (380ddb7dbc52479fb72e82724f8b295d) in parent group-v141606. [ 602.019404] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Creating folder: Instances. Parent ref: group-v141623. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 602.019541] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-267ad079-95b1-4ee3-ae50-90eff09da89b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.029772] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Created folder: Instances in parent group-v141623. [ 602.030218] env[60722]: DEBUG oslo.service.loopingcall [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 602.030399] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 602.030808] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2af4f183-f288-4295-8053-ec43d57ddf46 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.054850] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 602.054850] env[60722]: value = "task-565132" [ 602.054850] env[60722]: _type = "Task" [ 602.054850] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 602.064023] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565132, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 602.243105] env[60722]: DEBUG nova.network.neutron [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Successfully updated port: 1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 602.257936] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "refresh_cache-bc2a1e45-2f48-4a73-bfee-69a20725a610" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 602.257936] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquired lock "refresh_cache-bc2a1e45-2f48-4a73-bfee-69a20725a610" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 602.257936] env[60722]: DEBUG nova.network.neutron [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 602.331476] env[60722]: DEBUG nova.network.neutron [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Instance cache missing network info. 
{{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 602.423712] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 602.424455] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 602.424455] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 602.509525] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565128, 'name': CreateVM_Task} progress is 25%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 602.553573] env[60722]: DEBUG nova.compute.manager [req-eeb02d36-1baf-4400-aed7-21e4286a6765 req-ac665674-6fc0-4bf5-81fd-b0ad00d3760e service nova] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Received event network-vif-plugged-795fd727-775e-476e-b53d-6d712cb0d9e2 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 602.553573] env[60722]: DEBUG oslo_concurrency.lockutils [req-eeb02d36-1baf-4400-aed7-21e4286a6765 req-ac665674-6fc0-4bf5-81fd-b0ad00d3760e service nova] Acquiring lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 602.553838] env[60722]: DEBUG oslo_concurrency.lockutils [req-eeb02d36-1baf-4400-aed7-21e4286a6765 req-ac665674-6fc0-4bf5-81fd-b0ad00d3760e service nova] Lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 602.556225] env[60722]: DEBUG oslo_concurrency.lockutils [req-eeb02d36-1baf-4400-aed7-21e4286a6765 req-ac665674-6fc0-4bf5-81fd-b0ad00d3760e service nova] Lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 602.556225] env[60722]: DEBUG nova.compute.manager [req-eeb02d36-1baf-4400-aed7-21e4286a6765 req-ac665674-6fc0-4bf5-81fd-b0ad00d3760e service nova] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] No waiting events found dispatching network-vif-plugged-795fd727-775e-476e-b53d-6d712cb0d9e2 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 602.556225] env[60722]: WARNING nova.compute.manager [req-eeb02d36-1baf-4400-aed7-21e4286a6765 
req-ac665674-6fc0-4bf5-81fd-b0ad00d3760e service nova] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Received unexpected event network-vif-plugged-795fd727-775e-476e-b53d-6d712cb0d9e2 for instance with vm_state building and task_state spawning. [ 602.576323] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565132, 'name': CreateVM_Task, 'duration_secs': 0.312471} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 602.576323] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 602.576323] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 602.576323] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 602.578784] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 602.579050] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1b5f1194-d8a4-4566-a93e-981030d21504 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 602.585595] env[60722]: DEBUG oslo_vmware.api [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Waiting for the task: (returnval){ [ 602.585595] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52ff4207-c752-3b0e-9cff-948dc88385ac" [ 602.585595] env[60722]: _type = "Task" [ 602.585595] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 602.598795] env[60722]: DEBUG oslo_vmware.api [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52ff4207-c752-3b0e-9cff-948dc88385ac, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 603.008036] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565128, 'name': CreateVM_Task} progress is 25%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 603.098935] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 603.099086] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 603.099295] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 603.162305] env[60722]: DEBUG nova.network.neutron [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Updating instance_info_cache with network_info: [{"id": "1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e", "address": "fa:16:3e:20:bc:70", "network": {"id": "d8ffbf62-b735-4b9e-b07a-14cf3426b943", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-127230270-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "03406ae6612c4ceabe8c940d457db3fe", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e30245c5-78f5-48e6-b504-c6c21f5a9b45", "external-id": "nsx-vlan-transportzone-409", "segmentation_id": 409, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c6ba95e-39", "ovs_interfaceid": "1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 603.179628] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Releasing lock "refresh_cache-bc2a1e45-2f48-4a73-bfee-69a20725a610" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 603.180353] env[60722]: DEBUG nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Instance network_info: |[{"id": 
"1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e", "address": "fa:16:3e:20:bc:70", "network": {"id": "d8ffbf62-b735-4b9e-b07a-14cf3426b943", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-127230270-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "03406ae6612c4ceabe8c940d457db3fe", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e30245c5-78f5-48e6-b504-c6c21f5a9b45", "external-id": "nsx-vlan-transportzone-409", "segmentation_id": 409, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c6ba95e-39", "ovs_interfaceid": "1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 603.180970] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:20:bc:70', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e30245c5-78f5-48e6-b504-c6c21f5a9b45', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 603.196108] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Creating folder: Project (03406ae6612c4ceabe8c940d457db3fe). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 603.196108] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bcbbb4d1-e5b3-49b7-a30a-7f7eacf56924 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.208842] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Created folder: Project (03406ae6612c4ceabe8c940d457db3fe) in parent group-v141606. [ 603.209204] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Creating folder: Instances. Parent ref: group-v141626. 
{{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 603.209490] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6045404a-3316-4de4-bf6c-35e84558c126 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.220707] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Created folder: Instances in parent group-v141626. [ 603.220948] env[60722]: DEBUG oslo.service.loopingcall [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 603.221140] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 603.221329] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-612ebe80-64a4-485f-9fde-124a9a9e9234 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.245301] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 603.245301] env[60722]: value = "task-565135" [ 603.245301] env[60722]: _type = "Task" [ 603.245301] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 603.255058] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565135, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 603.506475] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565128, 'name': CreateVM_Task} progress is 25%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 603.511148] env[60722]: DEBUG nova.network.neutron [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Successfully updated port: c49cc32c-c002-4195-8e7b-7a5ec96c2efe {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 603.519436] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "refresh_cache-93268011-e1f2-4041-b4df-473c06d3f1eb" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 603.519575] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquired lock "refresh_cache-93268011-e1f2-4041-b4df-473c06d3f1eb" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 603.519733] env[60722]: DEBUG nova.network.neutron [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 603.561587] env[60722]: DEBUG nova.network.neutron [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 603.760693] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565135, 'name': CreateVM_Task, 'duration_secs': 0.273065} completed successfully. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 603.760693] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 603.760693] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 603.760693] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 603.761083] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 603.761253] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c2139eed-b47a-44df-bc61-3e1f00b122ec {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.768283] env[60722]: DEBUG oslo_vmware.api [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Waiting for the task: (returnval){ [ 603.768283] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52ba64b5-976d-e351-9e76-88fb7f633d61" [ 603.768283] env[60722]: _type = "Task" [ 603.768283] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 603.776366] env[60722]: DEBUG oslo_vmware.api [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52ba64b5-976d-e351-9e76-88fb7f633d61, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 603.777550] env[60722]: DEBUG nova.network.neutron [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Successfully updated port: b28573a9-8f69-4a37-8f53-9a3b5374aa59 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 603.787985] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "refresh_cache-e93b8d4b-6286-410a-870a-02fa7e59d90d" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 603.788188] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquired lock "refresh_cache-e93b8d4b-6286-410a-870a-02fa7e59d90d" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 603.788297] env[60722]: DEBUG nova.network.neutron [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 603.850424] env[60722]: DEBUG nova.network.neutron [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Updating instance_info_cache with network_info: [{"id": "c49cc32c-c002-4195-8e7b-7a5ec96c2efe", "address": "fa:16:3e:1c:f3:c9", "network": {"id": "5415af15-bc0f-4c4d-9917-5173f42d133c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1874207349-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "380ddb7dbc52479fb72e82724f8b295d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3cf748a8-7ae0-4dca-817d-e727c30d72f4", "external-id": "nsx-vlan-transportzone-853", "segmentation_id": 853, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc49cc32c-c0", "ovs_interfaceid": "c49cc32c-c002-4195-8e7b-7a5ec96c2efe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 603.864482] env[60722]: DEBUG nova.network.neutron [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Instance cache missing network info. 
{{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 603.868381] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Releasing lock "refresh_cache-93268011-e1f2-4041-b4df-473c06d3f1eb" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 603.868653] env[60722]: DEBUG nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Instance network_info: |[{"id": "c49cc32c-c002-4195-8e7b-7a5ec96c2efe", "address": "fa:16:3e:1c:f3:c9", "network": {"id": "5415af15-bc0f-4c4d-9917-5173f42d133c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1874207349-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "380ddb7dbc52479fb72e82724f8b295d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3cf748a8-7ae0-4dca-817d-e727c30d72f4", "external-id": "nsx-vlan-transportzone-853", "segmentation_id": 853, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc49cc32c-c0", "ovs_interfaceid": "c49cc32c-c002-4195-8e7b-7a5ec96c2efe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 603.871110] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1c:f3:c9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3cf748a8-7ae0-4dca-817d-e727c30d72f4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c49cc32c-c002-4195-8e7b-7a5ec96c2efe', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 603.878581] env[60722]: DEBUG oslo.service.loopingcall [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 603.879116] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 603.879393] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8e875f03-6c66-468f-a9cf-a47e67109821 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 603.901902] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 603.901902] env[60722]: value = "task-565136" [ 603.901902] env[60722]: _type = "Task" [ 603.901902] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 603.912393] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565136, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 604.011153] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565128, 'name': CreateVM_Task} progress is 25%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 604.283535] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 604.283982] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 604.284385] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 604.371114] env[60722]: DEBUG nova.network.neutron [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Updating instance_info_cache with network_info: [{"id": "b28573a9-8f69-4a37-8f53-9a3b5374aa59", "address": "fa:16:3e:ba:3c:15", "network": {"id": "ac22e575-6c87-499f-ace7-1e9de353b8fc", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-255045585-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9911ec5f31004b6493a91b6994b789c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", 
"port_filter": true, "nsx-logical-switch-id": "2d859f07-052d-4a69-bdf1-24261a6a6daa", "external-id": "nsx-vlan-transportzone-684", "segmentation_id": 684, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb28573a9-8f", "ovs_interfaceid": "b28573a9-8f69-4a37-8f53-9a3b5374aa59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 604.385787] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Releasing lock "refresh_cache-e93b8d4b-6286-410a-870a-02fa7e59d90d" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 604.386202] env[60722]: DEBUG nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Instance network_info: |[{"id": "b28573a9-8f69-4a37-8f53-9a3b5374aa59", "address": "fa:16:3e:ba:3c:15", "network": {"id": "ac22e575-6c87-499f-ace7-1e9de353b8fc", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-255045585-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9911ec5f31004b6493a91b6994b789c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d859f07-052d-4a69-bdf1-24261a6a6daa", "external-id": "nsx-vlan-transportzone-684", "segmentation_id": 684, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb28573a9-8f", "ovs_interfaceid": "b28573a9-8f69-4a37-8f53-9a3b5374aa59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 604.386473] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ba:3c:15', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2d859f07-052d-4a69-bdf1-24261a6a6daa', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b28573a9-8f69-4a37-8f53-9a3b5374aa59', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 604.398987] env[60722]: DEBUG oslo.service.loopingcall [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 604.399197] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 604.399412] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3e5dc2e4-d61d-4f23-818c-10a1b5a3c720 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.431471] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 604.431471] env[60722]: value = "task-565137" [ 604.431471] env[60722]: _type = "Task" [ 604.431471] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 604.431979] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565136, 'name': CreateVM_Task, 'duration_secs': 0.331161} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 604.432275] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 604.438131] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 604.438131] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 604.438131] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 604.438131] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6ac548ee-0c68-4f96-b08d-8815578b6269 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.442560] env[60722]: DEBUG oslo_vmware.api [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Waiting for the task: (returnval){ [ 604.442560] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52b4bd26-cb13-dc17-df61-0e6726b334e4" [ 604.442560] env[60722]: _type = "Task" [ 604.442560] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 604.446456] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565137, 'name': CreateVM_Task} progress is 6%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 604.453839] env[60722]: DEBUG nova.compute.manager [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Received event network-changed-3194faba-1b8e-4540-ad08-1eb13eb82802 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 604.453839] env[60722]: DEBUG nova.compute.manager [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Refreshing instance network info cache due to event network-changed-3194faba-1b8e-4540-ad08-1eb13eb82802. {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 604.454022] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Acquiring lock "refresh_cache-d1623803-5152-47d0-b1a7-5e8d4ab06233" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 604.454167] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Acquired lock "refresh_cache-d1623803-5152-47d0-b1a7-5e8d4ab06233" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 604.454337] env[60722]: DEBUG nova.network.neutron [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Refreshing network info cache for port 3194faba-1b8e-4540-ad08-1eb13eb82802 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 604.465970] env[60722]: DEBUG oslo_vmware.api [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52b4bd26-cb13-dc17-df61-0e6726b334e4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 604.505901] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565128, 'name': CreateVM_Task} progress is 99%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 604.948428] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565137, 'name': CreateVM_Task, 'duration_secs': 0.314069} completed successfully. 
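The CreateVM_Task records above show the wait/poll cycle: the task is invoked, its progress is polled, and completion is reported with a duration. A minimal sketch of that poll-until-done pattern; the interval, timeout, and get_state callable are assumptions for illustration, not the oslo.vmware implementation:

```python
import time

# Minimal poll-until-done loop matching the wait_for_task / _poll_task lines
# above. get_state() stands in for whatever reports task state and progress.
def wait_for_task(get_state, poll_interval=0.5, timeout=300.0):
    """Poll get_state() until it reports success, error, or the timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = get_state()  # e.g. {'state': 'running', 'progress': 25}
        if state["state"] == "success":
            return state
        if state["state"] == "error":
            raise RuntimeError(f"task failed: {state.get('error')}")
        time.sleep(poll_interval)
    raise TimeoutError("task did not complete in time")
```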
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 604.954282] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 604.954282] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 604.961650] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 604.961650] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 604.961650] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 604.961802] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 604.962032] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 604.962319] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-635437d5-7db0-48f8-9f6d-1cb33019c0ee {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.971656] env[60722]: DEBUG oslo_vmware.api [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for the task: (returnval){ [ 604.971656] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]5229f8cf-b340-ba78-0c3f-18840ab34818" [ 604.971656] env[60722]: _type = "Task" [ 604.971656] env[60722]: } to complete. 
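The image-cache records above serialize on the cached image path: the lock (and the external semaphore) is acquired before the datastore is searched and released once the image is known to be present. A sketch of the same serialize-on-cache-key pattern with oslo.concurrency, assuming a simplified lock name (the log locks on the full "[datastore1] devstack-image-cache_base/..." path) and placeholder fetch/check callables:

```python
from oslo_concurrency import lockutils

IMAGE_ID = "125a38d9-0f4e-49a0-83bc-e50e222251c8"

def get_cached_image(fetch_image, image_is_cached):
    # external=True adds a file-based interprocess lock, so concurrent
    # workers on the same host also serialize on the cache key.
    with lockutils.lock(IMAGE_ID, lock_file_prefix="image-cache",
                        external=True, lock_path="/tmp"):
        if not image_is_cached():
            fetch_image()
```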
{{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 604.982840] env[60722]: DEBUG oslo_vmware.api [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]5229f8cf-b340-ba78-0c3f-18840ab34818, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 605.011070] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565128, 'name': CreateVM_Task, 'duration_secs': 3.527608} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 605.011070] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 605.011070] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 605.123649] env[60722]: DEBUG nova.network.neutron [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Updated VIF entry in instance network info cache for port 3194faba-1b8e-4540-ad08-1eb13eb82802. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 605.124029] env[60722]: DEBUG nova.network.neutron [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Updating instance_info_cache with network_info: [{"id": "3194faba-1b8e-4540-ad08-1eb13eb82802", "address": "fa:16:3e:39:d1:a2", "network": {"id": "ac22e575-6c87-499f-ace7-1e9de353b8fc", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-255045585-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9911ec5f31004b6493a91b6994b789c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d859f07-052d-4a69-bdf1-24261a6a6daa", "external-id": "nsx-vlan-transportzone-684", "segmentation_id": 684, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3194faba-1b", "ovs_interfaceid": "3194faba-1b8e-4540-ad08-1eb13eb82802", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 605.134460] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Releasing lock "refresh_cache-d1623803-5152-47d0-b1a7-5e8d4ab06233" {{(pid=60722) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 605.134563] env[60722]: DEBUG nova.compute.manager [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Received event network-vif-plugged-bb696c50-f4e4-41f7-b7ea-1db9e799a4dd {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 605.135040] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Acquiring lock "d09f5c24-d76b-4ff9-acdd-8da94d70f9cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.135040] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Lock "d09f5c24-d76b-4ff9-acdd-8da94d70f9cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.135152] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Lock "d09f5c24-d76b-4ff9-acdd-8da94d70f9cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.135277] env[60722]: DEBUG nova.compute.manager [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] No waiting events found dispatching network-vif-plugged-bb696c50-f4e4-41f7-b7ea-1db9e799a4dd {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 605.135434] env[60722]: WARNING nova.compute.manager [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Received unexpected event network-vif-plugged-bb696c50-f4e4-41f7-b7ea-1db9e799a4dd for instance with vm_state building and task_state spawning. [ 605.136219] env[60722]: DEBUG nova.compute.manager [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Received event network-changed-bb696c50-f4e4-41f7-b7ea-1db9e799a4dd {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 605.136219] env[60722]: DEBUG nova.compute.manager [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Refreshing instance network info cache due to event network-changed-bb696c50-f4e4-41f7-b7ea-1db9e799a4dd. 
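The "&lt;instance-uuid&gt;-events" lock records above show how externally reported Neutron events (network-vif-plugged, network-changed) are matched against waiters registered per instance; when nothing is waiting, the event is logged as unexpected. A minimal sketch of that pop-waiting-event pattern using stdlib primitives; the class and function names are illustrative, not Nova's:

```python
import threading

# Per-instance event registry guarded by a lock, mirroring the
# pop_instance_event / "No waiting events found" / "unexpected event" flow.
class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare(self, instance_uuid, event_name):
        waiter = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = waiter
        return waiter

    def pop(self, instance_uuid, event_name):
        with self._lock:
            return self._waiters.pop((instance_uuid, event_name), None)


def handle_external_event(events, instance_uuid, event_name):
    waiter = events.pop(instance_uuid, event_name)
    if waiter is None:
        # Nothing registered for this event: report it as unexpected.
        print(f"unexpected event {event_name} for instance {instance_uuid}")
    else:
        waiter.set()
```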
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 605.136219] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Acquiring lock "refresh_cache-d09f5c24-d76b-4ff9-acdd-8da94d70f9cb" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 605.136219] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Acquired lock "refresh_cache-d09f5c24-d76b-4ff9-acdd-8da94d70f9cb" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 605.136219] env[60722]: DEBUG nova.network.neutron [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Refreshing network info cache for port bb696c50-f4e4-41f7-b7ea-1db9e799a4dd {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 605.486396] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 605.486658] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 605.486877] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 605.487098] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 605.487406] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 605.487651] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-30153827-5fda-4239-bd09-8755edba4ef8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.493641] env[60722]: DEBUG oslo_vmware.api [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 
tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Waiting for the task: (returnval){ [ 605.493641] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]522d98be-9547-6ea3-917b-96d4993f3e85" [ 605.493641] env[60722]: _type = "Task" [ 605.493641] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 605.502820] env[60722]: DEBUG oslo_vmware.api [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]522d98be-9547-6ea3-917b-96d4993f3e85, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 605.868124] env[60722]: DEBUG nova.network.neutron [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Updated VIF entry in instance network info cache for port bb696c50-f4e4-41f7-b7ea-1db9e799a4dd. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 605.868124] env[60722]: DEBUG nova.network.neutron [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Updating instance_info_cache with network_info: [{"id": "bb696c50-f4e4-41f7-b7ea-1db9e799a4dd", "address": "fa:16:3e:f5:85:26", "network": {"id": "0a60db10-ef9e-4f2e-8b4a-b5004ec2531e", "bridge": "br-int", "label": "tempest-ServersTestJSON-1154734442-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3f1f4a6c22054e81819695c8461fa7fc", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9f3a2eb5-353f-45c5-a73b-869626f4bb13", "external-id": "nsx-vlan-transportzone-411", "segmentation_id": 411, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbb696c50-f4", "ovs_interfaceid": "bb696c50-f4e4-41f7-b7ea-1db9e799a4dd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 605.881794] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Releasing lock "refresh_cache-d09f5c24-d76b-4ff9-acdd-8da94d70f9cb" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 605.883962] env[60722]: DEBUG nova.compute.manager [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Received event network-vif-plugged-ab6639e6-d484-4b11-ba0a-470a82b1d444 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 605.883962] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d 
req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Acquiring lock "1b18a8e4-eab9-4f28-bd87-a354c436b51c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.883962] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Lock "1b18a8e4-eab9-4f28-bd87-a354c436b51c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.883962] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Lock "1b18a8e4-eab9-4f28-bd87-a354c436b51c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.886504] env[60722]: DEBUG nova.compute.manager [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] No waiting events found dispatching network-vif-plugged-ab6639e6-d484-4b11-ba0a-470a82b1d444 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 605.886504] env[60722]: WARNING nova.compute.manager [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Received unexpected event network-vif-plugged-ab6639e6-d484-4b11-ba0a-470a82b1d444 for instance with vm_state building and task_state spawning. [ 605.886504] env[60722]: DEBUG nova.compute.manager [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Received event network-changed-ab6639e6-d484-4b11-ba0a-470a82b1d444 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 605.886504] env[60722]: DEBUG nova.compute.manager [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Refreshing instance network info cache due to event network-changed-ab6639e6-d484-4b11-ba0a-470a82b1d444. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 605.886504] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Acquiring lock "refresh_cache-1b18a8e4-eab9-4f28-bd87-a354c436b51c" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 605.887084] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Acquired lock "refresh_cache-1b18a8e4-eab9-4f28-bd87-a354c436b51c" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 605.887084] env[60722]: DEBUG nova.network.neutron [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Refreshing network info cache for port ab6639e6-d484-4b11-ba0a-470a82b1d444 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 606.008690] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.009526] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 606.009526] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 606.381884] env[60722]: DEBUG nova.network.neutron [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Updated VIF entry in instance network info cache for port ab6639e6-d484-4b11-ba0a-470a82b1d444. 
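The "Updated VIF entry in instance network info cache for port ..." records above replace a single port's entry inside the cached network_info list after a network-changed event, leaving the other VIFs untouched. A sketch of that replace-or-append step on plain dicts; the function name is hypothetical:

```python
# Replace the cached entry whose id matches the refreshed port, or append it
# if the port is not in the cache yet. Purely illustrative.
def update_vif_entry(cached_network_info: list[dict], refreshed_vif: dict) -> list[dict]:
    updated = []
    replaced = False
    for vif in cached_network_info:
        if vif["id"] == refreshed_vif["id"]:
            updated.append(refreshed_vif)
            replaced = True
        else:
            updated.append(vif)
    if not replaced:
        updated.append(refreshed_vif)
    return updated
```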
{{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 606.382337] env[60722]: DEBUG nova.network.neutron [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Updating instance_info_cache with network_info: [{"id": "ab6639e6-d484-4b11-ba0a-470a82b1d444", "address": "fa:16:3e:74:3f:ef", "network": {"id": "ac22e575-6c87-499f-ace7-1e9de353b8fc", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-255045585-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9911ec5f31004b6493a91b6994b789c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d859f07-052d-4a69-bdf1-24261a6a6daa", "external-id": "nsx-vlan-transportzone-684", "segmentation_id": 684, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapab6639e6-d4", "ovs_interfaceid": "ab6639e6-d484-4b11-ba0a-470a82b1d444", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 606.402182] env[60722]: DEBUG oslo_concurrency.lockutils [req-083f4f43-4542-45e3-80a5-5b8a43a08e7d req-68158866-459a-446b-be4a-952f3c3e1137 service nova] Releasing lock "refresh_cache-1b18a8e4-eab9-4f28-bd87-a354c436b51c" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.619902] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Received event network-changed-795fd727-775e-476e-b53d-6d712cb0d9e2 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 606.620501] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Refreshing instance network info cache due to event network-changed-795fd727-775e-476e-b53d-6d712cb0d9e2. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 606.621295] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Acquiring lock "refresh_cache-65901b4a-42cf-4795-abc2-b0fea1f4fee7" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 606.621495] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Acquired lock "refresh_cache-65901b4a-42cf-4795-abc2-b0fea1f4fee7" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 606.621829] env[60722]: DEBUG nova.network.neutron [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Refreshing network info cache for port 795fd727-775e-476e-b53d-6d712cb0d9e2 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 606.962266] env[60722]: DEBUG nova.network.neutron [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Updated VIF entry in instance network info cache for port 795fd727-775e-476e-b53d-6d712cb0d9e2. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 606.969031] env[60722]: DEBUG nova.network.neutron [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Updating instance_info_cache with network_info: [{"id": "795fd727-775e-476e-b53d-6d712cb0d9e2", "address": "fa:16:3e:f4:80:7e", "network": {"id": "dacebd88-01b3-4b24-8af6-26769aea4f15", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.85", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "0b91883855c8437587c531188adfc164", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap795fd727-77", "ovs_interfaceid": "795fd727-775e-476e-b53d-6d712cb0d9e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 606.982218] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Releasing lock "refresh_cache-65901b4a-42cf-4795-abc2-b0fea1f4fee7" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.982460] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Received event network-vif-plugged-cadacd56-203f-44bf-bd8f-9b659a9e085a {{(pid=60722) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10998}} [ 606.982695] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Acquiring lock "bfde3558-9940-4402-bdf9-15c23c285a8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 606.982873] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Lock "bfde3558-9940-4402-bdf9-15c23c285a8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 606.983039] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Lock "bfde3558-9940-4402-bdf9-15c23c285a8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 606.983200] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] No waiting events found dispatching network-vif-plugged-cadacd56-203f-44bf-bd8f-9b659a9e085a {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 606.983359] env[60722]: WARNING nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Received unexpected event network-vif-plugged-cadacd56-203f-44bf-bd8f-9b659a9e085a for instance with vm_state building and task_state spawning. [ 606.983517] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Received event network-changed-cadacd56-203f-44bf-bd8f-9b659a9e085a {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 606.983800] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Refreshing instance network info cache due to event network-changed-cadacd56-203f-44bf-bd8f-9b659a9e085a. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 606.984039] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Acquiring lock "refresh_cache-bfde3558-9940-4402-bdf9-15c23c285a8f" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 606.984842] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Acquired lock "refresh_cache-bfde3558-9940-4402-bdf9-15c23c285a8f" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 606.985071] env[60722]: DEBUG nova.network.neutron [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Refreshing network info cache for port cadacd56-203f-44bf-bd8f-9b659a9e085a {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 607.402267] env[60722]: DEBUG nova.network.neutron [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Updated VIF entry in instance network info cache for port cadacd56-203f-44bf-bd8f-9b659a9e085a. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 607.405262] env[60722]: DEBUG nova.network.neutron [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Updating instance_info_cache with network_info: [{"id": "cadacd56-203f-44bf-bd8f-9b659a9e085a", "address": "fa:16:3e:72:0f:98", "network": {"id": "5415af15-bc0f-4c4d-9917-5173f42d133c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1874207349-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "380ddb7dbc52479fb72e82724f8b295d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3cf748a8-7ae0-4dca-817d-e727c30d72f4", "external-id": "nsx-vlan-transportzone-853", "segmentation_id": 853, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcadacd56-20", "ovs_interfaceid": "cadacd56-203f-44bf-bd8f-9b659a9e085a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 607.418373] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Releasing lock "refresh_cache-bfde3558-9940-4402-bdf9-15c23c285a8f" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 607.418373] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Received event network-vif-plugged-1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 607.418373] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Acquiring lock "bc2a1e45-2f48-4a73-bfee-69a20725a610-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.418373] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Lock "bc2a1e45-2f48-4a73-bfee-69a20725a610-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 607.418838] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Lock "bc2a1e45-2f48-4a73-bfee-69a20725a610-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 607.418838] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] No waiting events found dispatching network-vif-plugged-1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 607.418838] env[60722]: WARNING nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Received unexpected event network-vif-plugged-1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e for instance with vm_state building and task_state spawning. [ 607.419068] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Received event network-changed-1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 607.419068] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Refreshing instance network info cache due to event network-changed-1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 607.419303] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Acquiring lock "refresh_cache-bc2a1e45-2f48-4a73-bfee-69a20725a610" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.419362] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Acquired lock "refresh_cache-bc2a1e45-2f48-4a73-bfee-69a20725a610" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 607.420761] env[60722]: DEBUG nova.network.neutron [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Refreshing network info cache for port 1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 608.110638] env[60722]: DEBUG nova.network.neutron [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Updated VIF entry in instance network info cache for port 1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 608.110638] env[60722]: DEBUG nova.network.neutron [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Updating instance_info_cache with network_info: [{"id": "1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e", "address": "fa:16:3e:20:bc:70", "network": {"id": "d8ffbf62-b735-4b9e-b07a-14cf3426b943", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-127230270-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "03406ae6612c4ceabe8c940d457db3fe", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e30245c5-78f5-48e6-b504-c6c21f5a9b45", "external-id": "nsx-vlan-transportzone-409", "segmentation_id": 409, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c6ba95e-39", "ovs_interfaceid": "1c6ba95e-3958-4e7b-bd82-1e063f5d5d6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 608.123661] env[60722]: DEBUG nova.compute.manager [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Received event network-vif-plugged-b28573a9-8f69-4a37-8f53-9a3b5374aa59 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 608.123661] env[60722]: DEBUG oslo_concurrency.lockutils [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] Acquiring lock "e93b8d4b-6286-410a-870a-02fa7e59d90d-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.123661] env[60722]: DEBUG oslo_concurrency.lockutils [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] Lock "e93b8d4b-6286-410a-870a-02fa7e59d90d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.123661] env[60722]: DEBUG oslo_concurrency.lockutils [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] Lock "e93b8d4b-6286-410a-870a-02fa7e59d90d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.123891] env[60722]: DEBUG nova.compute.manager [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] No waiting events found dispatching network-vif-plugged-b28573a9-8f69-4a37-8f53-9a3b5374aa59 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 608.123891] env[60722]: WARNING nova.compute.manager [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Received unexpected event network-vif-plugged-b28573a9-8f69-4a37-8f53-9a3b5374aa59 for instance with vm_state building and task_state spawning. [ 608.124162] env[60722]: DEBUG nova.compute.manager [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Received event network-changed-b28573a9-8f69-4a37-8f53-9a3b5374aa59 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 608.126634] env[60722]: DEBUG nova.compute.manager [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Refreshing instance network info cache due to event network-changed-b28573a9-8f69-4a37-8f53-9a3b5374aa59. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 608.127371] env[60722]: DEBUG oslo_concurrency.lockutils [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] Acquiring lock "refresh_cache-e93b8d4b-6286-410a-870a-02fa7e59d90d" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 608.127371] env[60722]: DEBUG oslo_concurrency.lockutils [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] Acquired lock "refresh_cache-e93b8d4b-6286-410a-870a-02fa7e59d90d" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 608.127371] env[60722]: DEBUG nova.network.neutron [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Refreshing network info cache for port b28573a9-8f69-4a37-8f53-9a3b5374aa59 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 608.130624] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Releasing lock "refresh_cache-bc2a1e45-2f48-4a73-bfee-69a20725a610" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 608.136028] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Received event network-vif-plugged-c49cc32c-c002-4195-8e7b-7a5ec96c2efe {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 608.136028] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Acquiring lock "93268011-e1f2-4041-b4df-473c06d3f1eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.136028] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Lock "93268011-e1f2-4041-b4df-473c06d3f1eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.136028] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Lock "93268011-e1f2-4041-b4df-473c06d3f1eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.136396] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] No waiting events found dispatching network-vif-plugged-c49cc32c-c002-4195-8e7b-7a5ec96c2efe {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 608.136396] env[60722]: WARNING nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Received unexpected event 
network-vif-plugged-c49cc32c-c002-4195-8e7b-7a5ec96c2efe for instance with vm_state building and task_state spawning. [ 608.136396] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Received event network-changed-c49cc32c-c002-4195-8e7b-7a5ec96c2efe {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 608.136396] env[60722]: DEBUG nova.compute.manager [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Refreshing instance network info cache due to event network-changed-c49cc32c-c002-4195-8e7b-7a5ec96c2efe. {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 608.136396] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Acquiring lock "refresh_cache-93268011-e1f2-4041-b4df-473c06d3f1eb" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 608.136674] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Acquired lock "refresh_cache-93268011-e1f2-4041-b4df-473c06d3f1eb" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 608.136674] env[60722]: DEBUG nova.network.neutron [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Refreshing network info cache for port c49cc32c-c002-4195-8e7b-7a5ec96c2efe {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 608.529805] env[60722]: DEBUG nova.network.neutron [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Updated VIF entry in instance network info cache for port c49cc32c-c002-4195-8e7b-7a5ec96c2efe. 
{{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 608.529805] env[60722]: DEBUG nova.network.neutron [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Updating instance_info_cache with network_info: [{"id": "c49cc32c-c002-4195-8e7b-7a5ec96c2efe", "address": "fa:16:3e:1c:f3:c9", "network": {"id": "5415af15-bc0f-4c4d-9917-5173f42d133c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1874207349-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "380ddb7dbc52479fb72e82724f8b295d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3cf748a8-7ae0-4dca-817d-e727c30d72f4", "external-id": "nsx-vlan-transportzone-853", "segmentation_id": 853, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc49cc32c-c0", "ovs_interfaceid": "c49cc32c-c002-4195-8e7b-7a5ec96c2efe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 608.545873] env[60722]: DEBUG oslo_concurrency.lockutils [req-2241e22d-3b64-4ca2-bc50-f423e2bbfab0 req-e99b7a85-4b7f-4e3a-a18b-4bb5b0054227 service nova] Releasing lock "refresh_cache-93268011-e1f2-4041-b4df-473c06d3f1eb" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 608.891385] env[60722]: DEBUG nova.network.neutron [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Updated VIF entry in instance network info cache for port b28573a9-8f69-4a37-8f53-9a3b5374aa59. 
{{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 608.892666] env[60722]: DEBUG nova.network.neutron [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Updating instance_info_cache with network_info: [{"id": "b28573a9-8f69-4a37-8f53-9a3b5374aa59", "address": "fa:16:3e:ba:3c:15", "network": {"id": "ac22e575-6c87-499f-ace7-1e9de353b8fc", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-255045585-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9911ec5f31004b6493a91b6994b789c1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d859f07-052d-4a69-bdf1-24261a6a6daa", "external-id": "nsx-vlan-transportzone-684", "segmentation_id": 684, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb28573a9-8f", "ovs_interfaceid": "b28573a9-8f69-4a37-8f53-9a3b5374aa59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 608.903821] env[60722]: DEBUG oslo_concurrency.lockutils [req-18ef133d-d108-4948-bcad-d0ccd751d3e0 req-54503cfa-9b2e-4b81-a2c1-31398a82a45f service nova] Releasing lock "refresh_cache-e93b8d4b-6286-410a-870a-02fa7e59d90d" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 635.958018] env[60722]: WARNING oslo_vmware.rw_handles [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 635.958018] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 635.958018] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 635.958018] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 635.958018] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 635.958018] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 635.958018] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 635.958018] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 635.958018] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 635.958018] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 635.958018] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 635.958018] env[60722]: ERROR oslo_vmware.rw_handles [ 635.958018] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 
tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/f16d5ba2-a470-4ed8-96a5-8bade3e2016f/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 635.958704] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Caching image {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 635.958704] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Copying Virtual Disk [datastore1] vmware_temp/f16d5ba2-a470-4ed8-96a5-8bade3e2016f/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/f16d5ba2-a470-4ed8-96a5-8bade3e2016f/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 635.958704] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cdca270a-4e36-460e-8ffb-6ac7336e7daa {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 635.970468] env[60722]: DEBUG oslo_vmware.api [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Waiting for the task: (returnval){ [ 635.970468] env[60722]: value = "task-565138" [ 635.970468] env[60722]: _type = "Task" [ 635.970468] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 635.980597] env[60722]: DEBUG oslo_vmware.api [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Task: {'id': task-565138, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 636.479618] env[60722]: DEBUG oslo_vmware.exceptions [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Fault InvalidArgument not matched. 
{{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 636.479851] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 636.481184] env[60722]: ERROR nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 636.481184] env[60722]: Faults: ['InvalidArgument'] [ 636.481184] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Traceback (most recent call last): [ 636.481184] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 636.481184] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] yield resources [ 636.481184] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 636.481184] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] self.driver.spawn(context, instance, image_meta, [ 636.481184] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 636.481184] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] self._vmops.spawn(context, instance, image_meta, injected_files, [ 636.481184] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 636.481184] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] self._fetch_image_if_missing(context, vi) [ 636.481184] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 636.481623] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] image_cache(vi, tmp_image_ds_loc) [ 636.481623] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 636.481623] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] vm_util.copy_virtual_disk( [ 636.481623] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 636.481623] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] session._wait_for_task(vmdk_copy_task) [ 636.481623] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 636.481623] 
env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] return self.wait_for_task(task_ref) [ 636.481623] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 636.481623] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] return evt.wait() [ 636.481623] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 636.481623] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] result = hub.switch() [ 636.481623] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 636.481623] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] return self.greenlet.switch() [ 636.482052] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 636.482052] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] self.f(*self.args, **self.kw) [ 636.482052] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 636.482052] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] raise exceptions.translate_fault(task_info.error) [ 636.482052] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 636.482052] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Faults: ['InvalidArgument'] [ 636.482052] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] [ 636.482052] env[60722]: INFO nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Terminating instance [ 636.484148] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 636.484148] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 636.484148] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ae28cca7-1e7e-47ba-aeb7-e44309d3eded {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.486179] env[60722]: DEBUG nova.compute.manager [None 
req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 636.486407] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 636.487282] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae8cc2b2-fdfb-42c8-9d73-79319b1fe1af {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.494100] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 636.494356] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7e78c548-34e2-4218-b9e6-b9a528d50b2b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.496631] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 636.496891] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 636.497796] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d7d260ae-b47c-4ef5-a401-6e70727470ca {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.502932] env[60722]: DEBUG oslo_vmware.api [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Waiting for the task: (returnval){ [ 636.502932] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]520199c7-0258-ed28-b213-25c607c9c9bf" [ 636.502932] env[60722]: _type = "Task" [ 636.502932] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 636.510961] env[60722]: DEBUG oslo_vmware.api [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]520199c7-0258-ed28-b213-25c607c9c9bf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 636.577823] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 636.577937] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 636.578665] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Deleting the datastore file [datastore1] a2ad54e2-c14a-4548-aedd-01668745b397 {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 636.578665] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c949e837-611f-4a3e-8284-cf61f1241cfa {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.586475] env[60722]: DEBUG oslo_vmware.api [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Waiting for the task: (returnval){ [ 636.586475] env[60722]: value = "task-565140" [ 636.586475] env[60722]: _type = "Task" [ 636.586475] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 636.594978] env[60722]: DEBUG oslo_vmware.api [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Task: {'id': task-565140, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 637.018893] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 637.019236] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Creating directory with path [datastore1] vmware_temp/3a5e3f43-c1e6-45fb-9b83-b14428720abb/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 637.019483] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cb8adb93-577b-497c-88de-2f9313d22500 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.032584] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Created directory with path [datastore1] vmware_temp/3a5e3f43-c1e6-45fb-9b83-b14428720abb/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 637.033326] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Fetch image to [datastore1] vmware_temp/3a5e3f43-c1e6-45fb-9b83-b14428720abb/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 637.033326] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/3a5e3f43-c1e6-45fb-9b83-b14428720abb/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 637.033838] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9fe17a43-f3dd-4e7e-ab3a-53f271051258 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.048586] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99e9d2d2-6b11-447b-a1a7-47d93845a648 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.058537] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f88b21e-94d4-4b35-86f4-7ea013ced697 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.094663] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b818d87-78df-41d6-91e7-d07b0a6ed741 {{(pid=60722) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.103806] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-dc90a664-b937-4ea1-9862-42c559df4876 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.105772] env[60722]: DEBUG oslo_vmware.api [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Task: {'id': task-565140, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084093} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 637.106181] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 637.106181] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 637.106483] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 637.106552] env[60722]: INFO nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 637.112871] env[60722]: DEBUG nova.compute.claims [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 637.112871] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.112871] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.195616] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 637.297305] env[60722]: DEBUG oslo_vmware.rw_handles [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3a5e3f43-c1e6-45fb-9b83-b14428720abb/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 637.360394] env[60722]: DEBUG oslo_vmware.rw_handles [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Completed reading data from the image iterator. {{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 637.360394] env[60722]: DEBUG oslo_vmware.rw_handles [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3a5e3f43-c1e6-45fb-9b83-b14428720abb/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 637.427840] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2d1c6f5-e633-47cf-833a-07d52c2ccf4a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.437522] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f865befb-287c-4894-80d9-3908c0769354 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.472289] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03d836c4-283e-4a52-bc10-c41996eac62d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.480445] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-632b1fce-453e-48fc-ae45-1cf3c5f906ea {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.494328] env[60722]: DEBUG nova.compute.provider_tree [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 637.505252] env[60722]: DEBUG nova.scheduler.client.report [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 637.527209] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.414s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 637.531317] env[60722]: ERROR nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 637.531317] env[60722]: Faults: ['InvalidArgument'] [ 637.531317] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Traceback (most recent call last): [ 637.531317] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 637.531317] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] 
self.driver.spawn(context, instance, image_meta, [ 637.531317] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 637.531317] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] self._vmops.spawn(context, instance, image_meta, injected_files, [ 637.531317] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 637.531317] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] self._fetch_image_if_missing(context, vi) [ 637.531317] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 637.531317] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] image_cache(vi, tmp_image_ds_loc) [ 637.531317] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] vm_util.copy_virtual_disk( [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] session._wait_for_task(vmdk_copy_task) [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] return self.wait_for_task(task_ref) [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] return evt.wait() [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] result = hub.switch() [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] return self.greenlet.switch() [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 637.533823] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] self.f(*self.args, **self.kw) [ 637.534698] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 637.534698] env[60722]: ERROR nova.compute.manager [instance: 
a2ad54e2-c14a-4548-aedd-01668745b397] raise exceptions.translate_fault(task_info.error) [ 637.534698] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 637.534698] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Faults: ['InvalidArgument'] [ 637.534698] env[60722]: ERROR nova.compute.manager [instance: a2ad54e2-c14a-4548-aedd-01668745b397] [ 637.534698] env[60722]: DEBUG nova.compute.utils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] VimFaultException {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 637.534698] env[60722]: DEBUG nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Build of instance a2ad54e2-c14a-4548-aedd-01668745b397 was re-scheduled: A specified parameter was not correct: fileType [ 637.534698] env[60722]: Faults: ['InvalidArgument'] {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 637.534698] env[60722]: DEBUG nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 637.534978] env[60722]: DEBUG nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 637.534978] env[60722]: DEBUG nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 637.535057] env[60722]: DEBUG nova.network.neutron [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 639.235152] env[60722]: DEBUG nova.network.neutron [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 639.247174] env[60722]: INFO nova.compute.manager [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] Took 1.71 seconds to deallocate network for instance. 
[ 639.370037] env[60722]: INFO nova.scheduler.client.report [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Deleted allocations for instance a2ad54e2-c14a-4548-aedd-01668745b397 [ 639.398412] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6650f2c1-4ea0-4454-a7c1-21fe29cb8caa tempest-TenantUsagesTestJSON-565169773 tempest-TenantUsagesTestJSON-565169773-project-member] Lock "a2ad54e2-c14a-4548-aedd-01668745b397" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 61.429s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 639.398661] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "a2ad54e2-c14a-4548-aedd-01668745b397" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 47.218s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 639.398842] env[60722]: INFO nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: a2ad54e2-c14a-4548-aedd-01668745b397] During sync_power_state the instance has a pending task (spawning). Skip. [ 639.399018] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "a2ad54e2-c14a-4548-aedd-01668745b397" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 654.624549] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 654.624789] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 654.658261] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 654.658261] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 654.658261] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 654.658261] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60722) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 654.945997] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 655.944969] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 655.944969] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Starting heal instance info cache {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 655.944969] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Rebuilding the list of instances to heal {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 655.972140] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 655.972140] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 655.972140] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 655.972140] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 655.972140] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 655.972284] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 655.972284] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Skipping network cache update for instance because it is Building. 
{{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 655.972402] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 655.972516] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 655.972631] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Didn't find any instances for network info cache update. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 655.973586] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 655.973586] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 655.973771] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 655.986039] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 655.986039] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.986039] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 655.986039] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 655.986039] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f74dd87-ff67-42ba-9dc7-f6b7844cbc19 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.998614] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-fd9122fc-0312-4acb-8294-436b0871fb42 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.024871] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2084b962-4e5e-4ba1-8edd-6d54e6163fcc {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.033013] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa3bdaaa-0e41-4b0b-a09d-4e9b7837ec1f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.084979] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181685MB free_disk=100GB free_vcpus=48 pci_devices=None {{(pid=60722) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 656.086304] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.086553] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.167107] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 42da538f-82b8-4c91-93e3-1dc84a2eabda actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 656.167107] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance d1623803-5152-47d0-b1a7-5e8d4ab06233 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 656.167107] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance d09f5c24-d76b-4ff9-acdd-8da94d70f9cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 656.167232] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 65901b4a-42cf-4795-abc2-b0fea1f4fee7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 656.167328] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 1b18a8e4-eab9-4f28-bd87-a354c436b51c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 656.167439] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance bfde3558-9940-4402-bdf9-15c23c285a8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 656.167579] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance bc2a1e45-2f48-4a73-bfee-69a20725a610 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 656.167955] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance e93b8d4b-6286-410a-870a-02fa7e59d90d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 656.167955] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 93268011-e1f2-4041-b4df-473c06d3f1eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 656.167955] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 656.170717] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1728MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 656.351848] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f246461e-84ab-45e1-8073-78b022db5cbe {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.365183] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98db4685-f660-475d-9a41-b20fa447183b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.398889] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad42d919-89f4-4a35-877a-2b92864efdc0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.406797] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bca2689-2209-49d0-979f-6d58e4be3082 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.422111] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 656.432354] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 656.449416] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 656.449599] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.363s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 666.743098] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a 
tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Acquiring lock "eae8d9ce-9fe3-411e-9fd8-05920fb0af04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 666.744084] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Lock "eae8d9ce-9fe3-411e-9fd8-05920fb0af04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 666.755357] env[60722]: DEBUG nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 666.816183] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 666.816419] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 666.817940] env[60722]: INFO nova.compute.claims [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 667.024417] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04dfbd08-8e0b-4224-8ef0-6b946134787a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.032885] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-295fb2fe-c45b-435a-a418-49bdda26e6c1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.065753] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c85544c2-48b0-4f88-b860-cac261c94cab {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.073285] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5c7f82c-ba1a-4612-8262-289bf44dcdea {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.087536] env[60722]: DEBUG nova.compute.provider_tree [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Inventory has not changed in ProviderTree for provider: 
6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 667.097616] env[60722]: DEBUG nova.scheduler.client.report [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 667.111344] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.295s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 667.111832] env[60722]: DEBUG nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 667.143440] env[60722]: DEBUG nova.compute.utils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 667.145105] env[60722]: DEBUG nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 667.145279] env[60722]: DEBUG nova.network.neutron [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 667.168118] env[60722]: DEBUG nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 667.237376] env[60722]: DEBUG nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 667.263018] env[60722]: DEBUG nova.virt.hardware [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 667.264123] env[60722]: DEBUG nova.virt.hardware [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 667.264123] env[60722]: DEBUG nova.virt.hardware [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 667.264123] env[60722]: DEBUG nova.virt.hardware [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 667.264123] env[60722]: DEBUG nova.virt.hardware [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 667.264123] env[60722]: DEBUG nova.virt.hardware [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 667.264261] env[60722]: DEBUG nova.virt.hardware [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 667.264444] env[60722]: DEBUG nova.virt.hardware [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 667.264649] env[60722]: DEBUG nova.virt.hardware [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 
tempest-ServersTestJSON-1106866518-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 667.265976] env[60722]: DEBUG nova.virt.hardware [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 667.265976] env[60722]: DEBUG nova.virt.hardware [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 667.266159] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3be8ea9f-d1d3-48ca-87a7-6c27539993f5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.275022] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51db79bf-6cb0-4b12-b8d1-50a2fc65ebf7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.553431] env[60722]: DEBUG nova.policy [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bff4b88bd96242a68e1bcc934916ad93', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2102dc908494099be76f3967ac8876b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 669.705706] env[60722]: DEBUG nova.network.neutron [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Successfully created port: 5c9fb706-c952-40fe-9e82-4d1ababaeea1 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 673.500728] env[60722]: DEBUG nova.network.neutron [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Successfully updated port: 5c9fb706-c952-40fe-9e82-4d1ababaeea1 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 673.511826] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Acquiring lock "refresh_cache-eae8d9ce-9fe3-411e-9fd8-05920fb0af04" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 673.515490] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Acquired lock "refresh_cache-eae8d9ce-9fe3-411e-9fd8-05920fb0af04" {{(pid=60722) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 673.515490] env[60722]: DEBUG nova.network.neutron [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 673.630138] env[60722]: DEBUG nova.network.neutron [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 674.500279] env[60722]: DEBUG nova.network.neutron [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Updating instance_info_cache with network_info: [{"id": "5c9fb706-c952-40fe-9e82-4d1ababaeea1", "address": "fa:16:3e:94:b0:21", "network": {"id": "efcf3064-39e2-4afb-bdf8-76e21f111d46", "bridge": "br-int", "label": "tempest-ServersTestJSON-1375104337-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2102dc908494099be76f3967ac8876b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c9fb706-c9", "ovs_interfaceid": "5c9fb706-c952-40fe-9e82-4d1ababaeea1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 674.509412] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Releasing lock "refresh_cache-eae8d9ce-9fe3-411e-9fd8-05920fb0af04" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 674.510120] env[60722]: DEBUG nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Instance network_info: |[{"id": "5c9fb706-c952-40fe-9e82-4d1ababaeea1", "address": "fa:16:3e:94:b0:21", "network": {"id": "efcf3064-39e2-4afb-bdf8-76e21f111d46", "bridge": "br-int", "label": "tempest-ServersTestJSON-1375104337-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2102dc908494099be76f3967ac8876b", "mtu": 
8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c9fb706-c9", "ovs_interfaceid": "5c9fb706-c952-40fe-9e82-4d1ababaeea1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 674.510950] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:94:b0:21', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '21310d90-efbc-45a8-a97f-c4358606530f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5c9fb706-c952-40fe-9e82-4d1ababaeea1', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 674.519766] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Creating folder: Project (c2102dc908494099be76f3967ac8876b). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 674.519766] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f6e21520-8728-473d-83d2-b2c0bdf1b04d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 674.530856] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Created folder: Project (c2102dc908494099be76f3967ac8876b) in parent group-v141606. [ 674.531304] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Creating folder: Instances. Parent ref: group-v141634. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 674.531773] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4e835de0-ce42-44e5-9d46-99740a854aa4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 674.540613] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Created folder: Instances in parent group-v141634. [ 674.541491] env[60722]: DEBUG oslo.service.loopingcall [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 674.541788] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 674.542103] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5018d504-9b6e-4a99-be34-23c3798f4b3d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 674.563173] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 674.563173] env[60722]: value = "task-565148" [ 674.563173] env[60722]: _type = "Task" [ 674.563173] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 674.571382] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565148, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 674.680117] env[60722]: DEBUG nova.compute.manager [req-45023d62-e4b1-4946-8861-82de3a2dfe55 req-0f69ae4d-e180-4c93-a1d0-c54cbc871c05 service nova] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Received event network-vif-plugged-5c9fb706-c952-40fe-9e82-4d1ababaeea1 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 674.680371] env[60722]: DEBUG oslo_concurrency.lockutils [req-45023d62-e4b1-4946-8861-82de3a2dfe55 req-0f69ae4d-e180-4c93-a1d0-c54cbc871c05 service nova] Acquiring lock "eae8d9ce-9fe3-411e-9fd8-05920fb0af04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 674.680613] env[60722]: DEBUG oslo_concurrency.lockutils [req-45023d62-e4b1-4946-8861-82de3a2dfe55 req-0f69ae4d-e180-4c93-a1d0-c54cbc871c05 service nova] Lock "eae8d9ce-9fe3-411e-9fd8-05920fb0af04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 674.680827] env[60722]: DEBUG oslo_concurrency.lockutils [req-45023d62-e4b1-4946-8861-82de3a2dfe55 req-0f69ae4d-e180-4c93-a1d0-c54cbc871c05 service nova] Lock "eae8d9ce-9fe3-411e-9fd8-05920fb0af04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.680998] env[60722]: DEBUG nova.compute.manager [req-45023d62-e4b1-4946-8861-82de3a2dfe55 req-0f69ae4d-e180-4c93-a1d0-c54cbc871c05 service nova] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] No waiting events found dispatching network-vif-plugged-5c9fb706-c952-40fe-9e82-4d1ababaeea1 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 674.681629] env[60722]: WARNING nova.compute.manager [req-45023d62-e4b1-4946-8861-82de3a2dfe55 req-0f69ae4d-e180-4c93-a1d0-c54cbc871c05 service nova] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Received unexpected event network-vif-plugged-5c9fb706-c952-40fe-9e82-4d1ababaeea1 for instance with vm_state building and task_state spawning. [ 675.075404] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565148, 'name': CreateVM_Task, 'duration_secs': 0.293483} completed successfully. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 675.075585] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 675.076263] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 675.076425] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 675.076754] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 675.076946] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bad72cca-2df9-4ddf-a05b-0d4624a10adb {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.082260] env[60722]: DEBUG oslo_vmware.api [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Waiting for the task: (returnval){ [ 675.082260] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52cdec5c-b139-d649-812d-091b975c22d8" [ 675.082260] env[60722]: _type = "Task" [ 675.082260] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 675.091404] env[60722]: DEBUG oslo_vmware.api [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52cdec5c-b139-d649-812d-091b975c22d8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 675.596065] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 675.596674] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 675.596674] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 677.368636] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Acquiring lock "4e66f1dc-18c6-4d64-9bbe-9b061e795a65" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 677.368636] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Lock "4e66f1dc-18c6-4d64-9bbe-9b061e795a65" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 678.521112] env[60722]: DEBUG nova.compute.manager [req-fcbdd6cf-d568-4339-a256-d7597e700024 req-e4a394ca-78f4-4102-a637-15267cd32558 service nova] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Received event network-changed-5c9fb706-c952-40fe-9e82-4d1ababaeea1 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 678.521384] env[60722]: DEBUG nova.compute.manager [req-fcbdd6cf-d568-4339-a256-d7597e700024 req-e4a394ca-78f4-4102-a637-15267cd32558 service nova] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Refreshing instance network info cache due to event network-changed-5c9fb706-c952-40fe-9e82-4d1ababaeea1. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 678.521515] env[60722]: DEBUG oslo_concurrency.lockutils [req-fcbdd6cf-d568-4339-a256-d7597e700024 req-e4a394ca-78f4-4102-a637-15267cd32558 service nova] Acquiring lock "refresh_cache-eae8d9ce-9fe3-411e-9fd8-05920fb0af04" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 678.522149] env[60722]: DEBUG oslo_concurrency.lockutils [req-fcbdd6cf-d568-4339-a256-d7597e700024 req-e4a394ca-78f4-4102-a637-15267cd32558 service nova] Acquired lock "refresh_cache-eae8d9ce-9fe3-411e-9fd8-05920fb0af04" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 678.522605] env[60722]: DEBUG nova.network.neutron [req-fcbdd6cf-d568-4339-a256-d7597e700024 req-e4a394ca-78f4-4102-a637-15267cd32558 service nova] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Refreshing network info cache for port 5c9fb706-c952-40fe-9e82-4d1ababaeea1 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 680.052199] env[60722]: DEBUG nova.network.neutron [req-fcbdd6cf-d568-4339-a256-d7597e700024 req-e4a394ca-78f4-4102-a637-15267cd32558 service nova] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Updated VIF entry in instance network info cache for port 5c9fb706-c952-40fe-9e82-4d1ababaeea1. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 680.052199] env[60722]: DEBUG nova.network.neutron [req-fcbdd6cf-d568-4339-a256-d7597e700024 req-e4a394ca-78f4-4102-a637-15267cd32558 service nova] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Updating instance_info_cache with network_info: [{"id": "5c9fb706-c952-40fe-9e82-4d1ababaeea1", "address": "fa:16:3e:94:b0:21", "network": {"id": "efcf3064-39e2-4afb-bdf8-76e21f111d46", "bridge": "br-int", "label": "tempest-ServersTestJSON-1375104337-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2102dc908494099be76f3967ac8876b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "21310d90-efbc-45a8-a97f-c4358606530f", "external-id": "nsx-vlan-transportzone-672", "segmentation_id": 672, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c9fb706-c9", "ovs_interfaceid": "5c9fb706-c952-40fe-9e82-4d1ababaeea1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 680.072508] env[60722]: DEBUG oslo_concurrency.lockutils [req-fcbdd6cf-d568-4339-a256-d7597e700024 req-e4a394ca-78f4-4102-a637-15267cd32558 service nova] Releasing lock "refresh_cache-eae8d9ce-9fe3-411e-9fd8-05920fb0af04" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 681.410520] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Acquiring lock "22463917-2185-42f7-87b7-2b720be45c22" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.411518] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Lock "22463917-2185-42f7-87b7-2b720be45c22" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.663985] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Acquiring lock "1c4b8597-88ec-4e79-a749-f802803a5ffe" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.664553] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Lock "1c4b8597-88ec-4e79-a749-f802803a5ffe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.852813] env[60722]: WARNING oslo_vmware.rw_handles [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 682.852813] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 682.852813] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 682.852813] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 682.852813] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 682.852813] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 682.852813] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 682.852813] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 682.852813] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 682.852813] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 682.852813] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 682.852813] env[60722]: ERROR oslo_vmware.rw_handles [ 682.853578] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/3a5e3f43-c1e6-45fb-9b83-b14428720abb/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 
{{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 682.854776] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Caching image {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 682.855017] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Copying Virtual Disk [datastore1] vmware_temp/3a5e3f43-c1e6-45fb-9b83-b14428720abb/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/3a5e3f43-c1e6-45fb-9b83-b14428720abb/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 682.857258] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-74e85836-598a-41d2-9f5b-934a00af21b2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 682.867788] env[60722]: DEBUG oslo_vmware.api [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Waiting for the task: (returnval){ [ 682.867788] env[60722]: value = "task-565153" [ 682.867788] env[60722]: _type = "Task" [ 682.867788] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 682.875989] env[60722]: DEBUG oslo_vmware.api [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Task: {'id': task-565153, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 683.380664] env[60722]: DEBUG oslo_vmware.exceptions [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Fault InvalidArgument not matched. 
{{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 683.380664] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 683.380664] env[60722]: ERROR nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 683.380664] env[60722]: Faults: ['InvalidArgument'] [ 683.380664] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Traceback (most recent call last): [ 683.380664] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 683.380664] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] yield resources [ 683.380664] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 683.380664] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] self.driver.spawn(context, instance, image_meta, [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] self._vmops.spawn(context, instance, image_meta, injected_files, [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] self._fetch_image_if_missing(context, vi) [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] image_cache(vi, tmp_image_ds_loc) [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] vm_util.copy_virtual_disk( [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] session._wait_for_task(vmdk_copy_task) [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] return self.wait_for_task(task_ref) [ 683.381015] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 683.381331] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] return evt.wait() [ 683.381331] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 683.381331] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] result = hub.switch() [ 683.381331] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 683.381331] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] return self.greenlet.switch() [ 683.381331] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 683.381331] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] self.f(*self.args, **self.kw) [ 683.381331] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 683.381331] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] raise exceptions.translate_fault(task_info.error) [ 683.381331] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 683.381331] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Faults: ['InvalidArgument'] [ 683.381331] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] [ 683.381331] env[60722]: INFO nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Terminating instance [ 683.383672] env[60722]: DEBUG nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Start destroying the instance on the hypervisor. 
{{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 683.383672] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 683.383672] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 683.383672] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 683.385686] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a300fe6a-78ea-4a2e-9046-5d762c508764 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.387548] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c174d71f-f710-4773-9bbd-b26db8e4c921 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.393751] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 683.393751] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fef11f56-d06b-438b-a7e0-75744decf471 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.396893] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 683.397138] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 683.398166] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bb3328e4-d078-49ba-88b1-aa47f585c5cb {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.403394] env[60722]: DEBUG oslo_vmware.api [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for the task: (returnval){ [ 683.403394] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52457dbf-a308-681e-bc18-02e6b54aab7a" [ 683.403394] env[60722]: _type = "Task" [ 683.403394] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 683.410746] env[60722]: DEBUG oslo_vmware.api [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52457dbf-a308-681e-bc18-02e6b54aab7a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 683.478047] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 683.478345] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 683.480910] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Deleting the datastore file [datastore1] 42da538f-82b8-4c91-93e3-1dc84a2eabda {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 683.480910] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9ccf2707-849d-437d-9c4f-a7a5b241a29f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.485067] env[60722]: DEBUG oslo_vmware.api [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Waiting for the task: (returnval){ [ 683.485067] env[60722]: value = "task-565156" [ 683.485067] env[60722]: _type = "Task" [ 683.485067] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 683.495071] env[60722]: DEBUG oslo_vmware.api [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Task: {'id': task-565156, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 683.916500] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 683.916779] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Creating directory with path [datastore1] vmware_temp/4e5eb705-ab90-4a3f-9afd-0b0a79997a2c/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 683.916960] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a08c2be3-15fb-4791-934d-2dc7c0eb81a4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.930251] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Created directory with path [datastore1] vmware_temp/4e5eb705-ab90-4a3f-9afd-0b0a79997a2c/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 683.930251] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Fetch image to [datastore1] vmware_temp/4e5eb705-ab90-4a3f-9afd-0b0a79997a2c/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 683.930370] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/4e5eb705-ab90-4a3f-9afd-0b0a79997a2c/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 683.931171] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f28a6cae-5aeb-4345-863d-c4ef0b20c5fd {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.938110] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cafc0a2-0872-48e1-bfdc-d2815fdd4745 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.948137] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ca4d0cd-4511-4b9a-814c-bf7f700d8314 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.983015] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a97ffe9-ab91-429c-9d89-049c48f6b54f {{(pid=60722) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.995211] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-dbb67e6f-c20a-4671-908b-ce06c1b9e566 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 683.997313] env[60722]: DEBUG oslo_vmware.api [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Task: {'id': task-565156, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069961} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 683.997643] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 683.997719] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 683.997891] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 683.998126] env[60722]: INFO nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Took 0.62 seconds to destroy the instance on the hypervisor. 
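The failed spawn traced above follows oslo.vmware's invoke-then-poll pattern: a vSphere *_Task method is invoked through the API session ("Invoking VirtualDiskManager.CopyVirtualDisk_Task"), the task is polled ("progress is 0%" ... "completed successfully"), and a task-level error is translated into VimFaultException (here "A specified parameter was not correct: fileType", Faults: ['InvalidArgument'], raised out of _cache_sparse_image). A minimal sketch of that pattern, assuming a reachable vCenter and using only oslo.vmware's public session API; the host, credentials, datacenter moref and datastore paths are placeholders, not values taken from this log:

    # Sketch: invoke a vSphere task and wait on it the way the log entries above show.
    # All endpoint, credential and path values below are illustrative placeholders.
    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vmware_exc
    from oslo_vmware import vim_util

    session = vmware_api.VMwareAPISession(
        'vc.example.test',                # placeholder vCenter host
        'administrator@vsphere.local',    # placeholder user
        'secret',                         # placeholder password
        api_retry_count=2,
        task_poll_interval=0.5)

    # Managed object references: the virtual disk manager comes from the service
    # content; the datacenter moref value here is a placeholder.
    disk_mgr = session.vim.service_content.virtualDiskManager
    dc_ref = vim_util.get_moref('datacenter-2', 'Datacenter')

    # Invoking VirtualDiskManager.CopyVirtualDisk_Task (cf. the entries above).
    task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore1] vmware_temp/example/tmp-sparse.vmdk',
        sourceDatacenter=dc_ref,
        destName='[datastore1] vmware_temp/example/example.vmdk',
        destDatacenter=dc_ref)

    try:
        # Polls the task (the "Task: {'id': task-..., 'name': CopyVirtualDisk_Task}
        # progress is ..." entries) until it succeeds or errors out.
        session.wait_for_task(task)
    except vmware_exc.VimFaultException as exc:
        # A task error is translated into VimFaultException; the vCenter faults are
        # carried in fault_list (in this log: ['InvalidArgument'],
        # "A specified parameter was not correct: fileType").
        print(exc.fault_list, str(exc))

Nova reaches the same code path through its own wrappers (vm_util.copy_virtual_disk() invoking the task and session._wait_for_task() polling it, per the traceback above), so the sketch stands in for the fragment of the driver that the InvalidArgument fault interrupts before the resource claim is aborted in the entries that follow.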
[ 684.000186] env[60722]: DEBUG nova.compute.claims [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 684.000535] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 684.000535] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 684.018427] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 684.082247] env[60722]: DEBUG oslo_vmware.rw_handles [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e5eb705-ab90-4a3f-9afd-0b0a79997a2c/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 684.141144] env[60722]: DEBUG oslo_vmware.rw_handles [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Completed reading data from the image iterator. {{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 684.141330] env[60722]: DEBUG oslo_vmware.rw_handles [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4e5eb705-ab90-4a3f-9afd-0b0a79997a2c/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 684.326574] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26ca86da-fec6-4829-be3a-1292dc6c85e1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.334412] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9391ed5-4b33-4617-bb09-c56597eaa6b0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.367492] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ece6c463-719b-493c-b055-4af80ad051f5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.374694] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c2973c7-1a1d-4b26-82c6-600d8ecb3735 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 684.387642] env[60722]: DEBUG nova.compute.provider_tree [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 684.395709] env[60722]: DEBUG nova.scheduler.client.report [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 684.415820] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.415s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 684.416604] env[60722]: ERROR nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 684.416604] env[60722]: Faults: ['InvalidArgument'] [ 684.416604] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Traceback (most recent call last): [ 684.416604] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 684.416604] env[60722]: ERROR nova.compute.manager [instance: 
42da538f-82b8-4c91-93e3-1dc84a2eabda] self.driver.spawn(context, instance, image_meta, [ 684.416604] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 684.416604] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] self._vmops.spawn(context, instance, image_meta, injected_files, [ 684.416604] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 684.416604] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] self._fetch_image_if_missing(context, vi) [ 684.416604] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 684.416604] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] image_cache(vi, tmp_image_ds_loc) [ 684.416604] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] vm_util.copy_virtual_disk( [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] session._wait_for_task(vmdk_copy_task) [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] return self.wait_for_task(task_ref) [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] return evt.wait() [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] result = hub.switch() [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] return self.greenlet.switch() [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 684.416965] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] self.f(*self.args, **self.kw) [ 684.417296] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 684.417296] env[60722]: ERROR 
nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] raise exceptions.translate_fault(task_info.error) [ 684.417296] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 684.417296] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Faults: ['InvalidArgument'] [ 684.417296] env[60722]: ERROR nova.compute.manager [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] [ 684.417559] env[60722]: DEBUG nova.compute.utils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] VimFaultException {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 684.419795] env[60722]: DEBUG nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Build of instance 42da538f-82b8-4c91-93e3-1dc84a2eabda was re-scheduled: A specified parameter was not correct: fileType [ 684.419795] env[60722]: Faults: ['InvalidArgument'] {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 684.420318] env[60722]: DEBUG nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 684.420787] env[60722]: DEBUG nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 684.420787] env[60722]: DEBUG nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 684.420933] env[60722]: DEBUG nova.network.neutron [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 684.965082] env[60722]: DEBUG nova.network.neutron [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 684.979294] env[60722]: INFO nova.compute.manager [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] Took 0.56 seconds to deallocate network for instance. [ 685.029938] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Acquiring lock "2786801d-6211-4598-b357-4f0a0ffdd7d1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.030399] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Lock "2786801d-6211-4598-b357-4f0a0ffdd7d1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.086316] env[60722]: INFO nova.scheduler.client.report [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Deleted allocations for instance 42da538f-82b8-4c91-93e3-1dc84a2eabda [ 685.106034] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2478b896-bd4a-4c83-b91e-2495107413aa tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Lock "42da538f-82b8-4c91-93e3-1dc84a2eabda" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 101.894s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.107082] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "42da538f-82b8-4c91-93e3-1dc84a2eabda" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 92.926s {{(pid=60722) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.108190] env[60722]: INFO nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 42da538f-82b8-4c91-93e3-1dc84a2eabda] During sync_power_state the instance has a pending task (spawning). Skip. [ 685.108190] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "42da538f-82b8-4c91-93e3-1dc84a2eabda" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.128362] env[60722]: DEBUG nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 685.203711] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.204874] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 685.205823] env[60722]: INFO nova.compute.claims [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 685.478703] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7ab308f-f626-4586-8394-e1d685670f5c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.493275] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34523ffb-df2d-45eb-92bf-1fbe73579237 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.531537] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18359cad-d559-4485-9881-ead18e37e14c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.539608] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-561bd227-ada6-4581-8d50-dc5f05550303 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.553800] env[60722]: DEBUG nova.compute.provider_tree [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} [ 685.566020] env[60722]: DEBUG nova.scheduler.client.report [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 685.589030] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.383s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 685.589030] env[60722]: DEBUG nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 685.633104] env[60722]: DEBUG nova.compute.utils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 685.633893] env[60722]: DEBUG nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 685.633975] env[60722]: DEBUG nova.network.neutron [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 685.655021] env[60722]: DEBUG nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 685.742335] env[60722]: DEBUG nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 685.768464] env[60722]: DEBUG nova.virt.hardware [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 685.768695] env[60722]: DEBUG nova.virt.hardware [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 685.768847] env[60722]: DEBUG nova.virt.hardware [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 685.769038] env[60722]: DEBUG nova.virt.hardware [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 685.769544] env[60722]: DEBUG nova.virt.hardware [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 685.770493] env[60722]: DEBUG nova.virt.hardware [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 685.771042] env[60722]: DEBUG nova.virt.hardware [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 685.771153] env[60722]: DEBUG nova.virt.hardware [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 685.772451] env[60722]: DEBUG nova.virt.hardware [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e 
tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 685.772746] env[60722]: DEBUG nova.virt.hardware [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 685.773125] env[60722]: DEBUG nova.virt.hardware [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 685.774304] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c60f1c2-7bf4-4993-b2f3-180eee2eb008 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.784590] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-276f740c-6349-4a7f-a595-d578d513ca48 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 685.790178] env[60722]: DEBUG nova.policy [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f204cfc714d2422791d3359a2ab6117a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ae410e8bc03c4b96a8c2767feeabcdc1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 685.969044] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Acquiring lock "019db29d-b8e4-4592-b7c4-2c044e2b2a51" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 685.969326] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Lock "019db29d-b8e4-4592-b7c4-2c044e2b2a51" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 686.652270] env[60722]: DEBUG nova.network.neutron [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Successfully created port: b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 687.268014] env[60722]: DEBUG nova.network.neutron [None 
req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Successfully created port: 27afab40-1b84-4089-973a-32c8164be535 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 688.396432] env[60722]: DEBUG nova.network.neutron [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Successfully updated port: b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 688.514935] env[60722]: DEBUG oslo_concurrency.lockutils [None req-9f364c40-2870-4552-952f-1588106b934b tempest-SecurityGroupsTestJSON-205882840 tempest-SecurityGroupsTestJSON-205882840-project-member] Acquiring lock "151df220-ca11-4455-b620-f8fe5a1be5b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.515210] env[60722]: DEBUG oslo_concurrency.lockutils [None req-9f364c40-2870-4552-952f-1588106b934b tempest-SecurityGroupsTestJSON-205882840 tempest-SecurityGroupsTestJSON-205882840-project-member] Lock "151df220-ca11-4455-b620-f8fe5a1be5b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.864220] env[60722]: DEBUG oslo_concurrency.lockutils [None req-51fd68c0-ea71-4901-8fd0-76820c3d7c52 tempest-ServerPasswordTestJSON-1030397772 tempest-ServerPasswordTestJSON-1030397772-project-member] Acquiring lock "47f34953-a5e3-4b5e-9164-ec5980802298" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.864465] env[60722]: DEBUG oslo_concurrency.lockutils [None req-51fd68c0-ea71-4901-8fd0-76820c3d7c52 tempest-ServerPasswordTestJSON-1030397772 tempest-ServerPasswordTestJSON-1030397772-project-member] Lock "47f34953-a5e3-4b5e-9164-ec5980802298" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.233563] env[60722]: DEBUG nova.network.neutron [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Successfully updated port: 27afab40-1b84-4089-973a-32c8164be535 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 689.243685] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Acquiring lock "refresh_cache-4e66f1dc-18c6-4d64-9bbe-9b061e795a65" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 689.243819] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Acquired lock 
"refresh_cache-4e66f1dc-18c6-4d64-9bbe-9b061e795a65" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 689.243970] env[60722]: DEBUG nova.network.neutron [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 689.329619] env[60722]: DEBUG nova.network.neutron [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 689.650014] env[60722]: DEBUG nova.compute.manager [req-0b03cad1-0196-449e-9989-8e1f56ec3471 req-c7e03791-4c3d-4a43-88bd-4c0f83d41342 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Received event network-vif-plugged-b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 689.650014] env[60722]: DEBUG oslo_concurrency.lockutils [req-0b03cad1-0196-449e-9989-8e1f56ec3471 req-c7e03791-4c3d-4a43-88bd-4c0f83d41342 service nova] Acquiring lock "4e66f1dc-18c6-4d64-9bbe-9b061e795a65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.650014] env[60722]: DEBUG oslo_concurrency.lockutils [req-0b03cad1-0196-449e-9989-8e1f56ec3471 req-c7e03791-4c3d-4a43-88bd-4c0f83d41342 service nova] Lock "4e66f1dc-18c6-4d64-9bbe-9b061e795a65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.650014] env[60722]: DEBUG oslo_concurrency.lockutils [req-0b03cad1-0196-449e-9989-8e1f56ec3471 req-c7e03791-4c3d-4a43-88bd-4c0f83d41342 service nova] Lock "4e66f1dc-18c6-4d64-9bbe-9b061e795a65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 689.651116] env[60722]: DEBUG nova.compute.manager [req-0b03cad1-0196-449e-9989-8e1f56ec3471 req-c7e03791-4c3d-4a43-88bd-4c0f83d41342 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] No waiting events found dispatching network-vif-plugged-b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 689.651432] env[60722]: WARNING nova.compute.manager [req-0b03cad1-0196-449e-9989-8e1f56ec3471 req-c7e03791-4c3d-4a43-88bd-4c0f83d41342 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Received unexpected event network-vif-plugged-b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d for instance with vm_state building and task_state spawning. 
[ 689.693019] env[60722]: DEBUG oslo_concurrency.lockutils [None req-dfc281b1-df48-410b-a8dc-dafe3b200ee8 tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Acquiring lock "7f1d5c92-ea40-4ad1-b669-877028b69711" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.693019] env[60722]: DEBUG oslo_concurrency.lockutils [None req-dfc281b1-df48-410b-a8dc-dafe3b200ee8 tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Lock "7f1d5c92-ea40-4ad1-b669-877028b69711" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.057165] env[60722]: DEBUG nova.network.neutron [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Updating instance_info_cache with network_info: [{"id": "b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d", "address": "fa:16:3e:2d:ef:60", "network": {"id": "c1fdef28-2b1a-4bc1-a53a-17e8471a7316", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-624169681", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae410e8bc03c4b96a8c2767feeabcdc1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1323cb03-8367-485a-962e-131af8eba474", "external-id": "nsx-vlan-transportzone-41", "segmentation_id": 41, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb47f7eca-cc", "ovs_interfaceid": "b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "27afab40-1b84-4089-973a-32c8164be535", "address": "fa:16:3e:5a:a2:73", "network": {"id": "4b5cc2f0-f720-4dc4-886d-e5fb60fd5730", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-502526814", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "ae410e8bc03c4b96a8c2767feeabcdc1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d646f9d5-d2ad-4c22-bea5-85a965334de6", "external-id": "nsx-vlan-transportzone-606", "segmentation_id": 606, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap27afab40-1b", "ovs_interfaceid": "27afab40-1b84-4089-973a-32c8164be535", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) 
update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 690.073843] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Releasing lock "refresh_cache-4e66f1dc-18c6-4d64-9bbe-9b061e795a65" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 690.074857] env[60722]: DEBUG nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Instance network_info: |[{"id": "b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d", "address": "fa:16:3e:2d:ef:60", "network": {"id": "c1fdef28-2b1a-4bc1-a53a-17e8471a7316", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-624169681", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae410e8bc03c4b96a8c2767feeabcdc1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1323cb03-8367-485a-962e-131af8eba474", "external-id": "nsx-vlan-transportzone-41", "segmentation_id": 41, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb47f7eca-cc", "ovs_interfaceid": "b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "27afab40-1b84-4089-973a-32c8164be535", "address": "fa:16:3e:5a:a2:73", "network": {"id": "4b5cc2f0-f720-4dc4-886d-e5fb60fd5730", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-502526814", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "ae410e8bc03c4b96a8c2767feeabcdc1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d646f9d5-d2ad-4c22-bea5-85a965334de6", "external-id": "nsx-vlan-transportzone-606", "segmentation_id": 606, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap27afab40-1b", "ovs_interfaceid": "27afab40-1b84-4089-973a-32c8164be535", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 690.074857] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2d:ef:60', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1323cb03-8367-485a-962e-131af8eba474', 'network-type': 
'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:5a:a2:73', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd646f9d5-d2ad-4c22-bea5-85a965334de6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '27afab40-1b84-4089-973a-32c8164be535', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 690.087211] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Creating folder: Project (ae410e8bc03c4b96a8c2767feeabcdc1). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 690.087211] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f520e6de-cd49-4ae0-8d9d-30fc03ee6cc2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.096895] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2f879def-e0e0-4041-95c5-ecdafea31b68 tempest-ServersTestBootFromVolume-2075344351 tempest-ServersTestBootFromVolume-2075344351-project-member] Acquiring lock "ecade932-ccf5-4a5c-8348-6d88a311f3a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.097198] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2f879def-e0e0-4041-95c5-ecdafea31b68 tempest-ServersTestBootFromVolume-2075344351 tempest-ServersTestBootFromVolume-2075344351-project-member] Lock "ecade932-ccf5-4a5c-8348-6d88a311f3a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.102226] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Created folder: Project (ae410e8bc03c4b96a8c2767feeabcdc1) in parent group-v141606. [ 690.102405] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Creating folder: Instances. Parent ref: group-v141638. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 690.102635] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3ce8f87b-7ee3-4c50-83ca-64456bfb43c3 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.128107] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Created folder: Instances in parent group-v141638. [ 690.128370] env[60722]: DEBUG oslo.service.loopingcall [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 690.128556] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 690.128754] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9e519ed4-a266-4da2-96c6-fc8b72ac8d65 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.157424] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 690.157424] env[60722]: value = "task-565160" [ 690.157424] env[60722]: _type = "Task" [ 690.157424] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 690.171721] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565160, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 690.668021] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565160, 'name': CreateVM_Task, 'duration_secs': 0.340821} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 690.668405] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 690.669546] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 690.669706] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 690.670035] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 690.670312] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8870cbce-2822-45cf-bf05-ea5872c5aba9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.676281] env[60722]: DEBUG oslo_vmware.api [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Waiting for the task: (returnval){ [ 690.676281] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]522f8bc5-9ca1-338d-85a4-169bb14c89c2" [ 690.676281] env[60722]: _type = "Task" [ 690.676281] env[60722]: } to complete. 
{{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 690.685806] env[60722]: DEBUG oslo_vmware.api [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]522f8bc5-9ca1-338d-85a4-169bb14c89c2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 691.188562] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 691.188845] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 691.189063] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 691.552659] env[60722]: DEBUG oslo_concurrency.lockutils [None req-8d7183c9-1482-4f24-8e20-e57843b3ab2c tempest-ServersV294TestFqdnHostnames-1444459050 tempest-ServersV294TestFqdnHostnames-1444459050-project-member] Acquiring lock "f60a4fc6-1a93-4f54-8b34-71aa0a2e035c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.552924] env[60722]: DEBUG oslo_concurrency.lockutils [None req-8d7183c9-1482-4f24-8e20-e57843b3ab2c tempest-ServersV294TestFqdnHostnames-1444459050 tempest-ServersV294TestFqdnHostnames-1444459050-project-member] Lock "f60a4fc6-1a93-4f54-8b34-71aa0a2e035c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.895134] env[60722]: DEBUG nova.compute.manager [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Received event network-changed-b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 691.895453] env[60722]: DEBUG nova.compute.manager [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Refreshing instance network info cache due to event network-changed-b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 691.895580] env[60722]: DEBUG oslo_concurrency.lockutils [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] Acquiring lock "refresh_cache-4e66f1dc-18c6-4d64-9bbe-9b061e795a65" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 691.895643] env[60722]: DEBUG oslo_concurrency.lockutils [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] Acquired lock "refresh_cache-4e66f1dc-18c6-4d64-9bbe-9b061e795a65" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 691.895826] env[60722]: DEBUG nova.network.neutron [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Refreshing network info cache for port b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 692.075499] env[60722]: DEBUG oslo_concurrency.lockutils [None req-7b0464b1-c34a-4a7b-a452-3247651f91c9 tempest-AttachVolumeNegativeTest-1992155444 tempest-AttachVolumeNegativeTest-1992155444-project-member] Acquiring lock "22eaddb2-108d-44e4-8d9d-85fc91efa9f9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.075769] env[60722]: DEBUG oslo_concurrency.lockutils [None req-7b0464b1-c34a-4a7b-a452-3247651f91c9 tempest-AttachVolumeNegativeTest-1992155444 tempest-AttachVolumeNegativeTest-1992155444-project-member] Lock "22eaddb2-108d-44e4-8d9d-85fc91efa9f9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.254937] env[60722]: DEBUG nova.network.neutron [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Updated VIF entry in instance network info cache for port b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d. 
{{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 692.255426] env[60722]: DEBUG nova.network.neutron [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Updating instance_info_cache with network_info: [{"id": "b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d", "address": "fa:16:3e:2d:ef:60", "network": {"id": "c1fdef28-2b1a-4bc1-a53a-17e8471a7316", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-624169681", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae410e8bc03c4b96a8c2767feeabcdc1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1323cb03-8367-485a-962e-131af8eba474", "external-id": "nsx-vlan-transportzone-41", "segmentation_id": 41, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb47f7eca-cc", "ovs_interfaceid": "b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "27afab40-1b84-4089-973a-32c8164be535", "address": "fa:16:3e:5a:a2:73", "network": {"id": "4b5cc2f0-f720-4dc4-886d-e5fb60fd5730", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-502526814", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "ae410e8bc03c4b96a8c2767feeabcdc1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d646f9d5-d2ad-4c22-bea5-85a965334de6", "external-id": "nsx-vlan-transportzone-606", "segmentation_id": 606, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap27afab40-1b", "ovs_interfaceid": "27afab40-1b84-4089-973a-32c8164be535", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 692.264848] env[60722]: DEBUG oslo_concurrency.lockutils [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] Releasing lock "refresh_cache-4e66f1dc-18c6-4d64-9bbe-9b061e795a65" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 692.265114] env[60722]: DEBUG nova.compute.manager [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Received event network-vif-plugged-27afab40-1b84-4089-973a-32c8164be535 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 692.265317] env[60722]: DEBUG oslo_concurrency.lockutils [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service 
nova] Acquiring lock "4e66f1dc-18c6-4d64-9bbe-9b061e795a65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.265513] env[60722]: DEBUG oslo_concurrency.lockutils [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] Lock "4e66f1dc-18c6-4d64-9bbe-9b061e795a65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 692.265669] env[60722]: DEBUG oslo_concurrency.lockutils [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] Lock "4e66f1dc-18c6-4d64-9bbe-9b061e795a65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 692.265830] env[60722]: DEBUG nova.compute.manager [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] No waiting events found dispatching network-vif-plugged-27afab40-1b84-4089-973a-32c8164be535 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 692.265989] env[60722]: WARNING nova.compute.manager [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Received unexpected event network-vif-plugged-27afab40-1b84-4089-973a-32c8164be535 for instance with vm_state building and task_state spawning. [ 692.266154] env[60722]: DEBUG nova.compute.manager [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Received event network-changed-27afab40-1b84-4089-973a-32c8164be535 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 692.266301] env[60722]: DEBUG nova.compute.manager [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Refreshing instance network info cache due to event network-changed-27afab40-1b84-4089-973a-32c8164be535. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 692.266473] env[60722]: DEBUG oslo_concurrency.lockutils [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] Acquiring lock "refresh_cache-4e66f1dc-18c6-4d64-9bbe-9b061e795a65" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 692.266601] env[60722]: DEBUG oslo_concurrency.lockutils [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] Acquired lock "refresh_cache-4e66f1dc-18c6-4d64-9bbe-9b061e795a65" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 692.266750] env[60722]: DEBUG nova.network.neutron [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Refreshing network info cache for port 27afab40-1b84-4089-973a-32c8164be535 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 692.602051] env[60722]: DEBUG nova.network.neutron [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Updated VIF entry in instance network info cache for port 27afab40-1b84-4089-973a-32c8164be535. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 692.602482] env[60722]: DEBUG nova.network.neutron [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Updating instance_info_cache with network_info: [{"id": "b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d", "address": "fa:16:3e:2d:ef:60", "network": {"id": "c1fdef28-2b1a-4bc1-a53a-17e8471a7316", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-624169681", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ae410e8bc03c4b96a8c2767feeabcdc1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1323cb03-8367-485a-962e-131af8eba474", "external-id": "nsx-vlan-transportzone-41", "segmentation_id": 41, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb47f7eca-cc", "ovs_interfaceid": "b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "27afab40-1b84-4089-973a-32c8164be535", "address": "fa:16:3e:5a:a2:73", "network": {"id": "4b5cc2f0-f720-4dc4-886d-e5fb60fd5730", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-502526814", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.112", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "ae410e8bc03c4b96a8c2767feeabcdc1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": 
{"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d646f9d5-d2ad-4c22-bea5-85a965334de6", "external-id": "nsx-vlan-transportzone-606", "segmentation_id": 606, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap27afab40-1b", "ovs_interfaceid": "27afab40-1b84-4089-973a-32c8164be535", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 692.612090] env[60722]: DEBUG oslo_concurrency.lockutils [req-093d520f-9c52-4443-a1d7-2d88fdb2955f req-78e9225d-06c8-429e-a28c-c38a2df74368 service nova] Releasing lock "refresh_cache-4e66f1dc-18c6-4d64-9bbe-9b061e795a65" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 700.310410] env[60722]: DEBUG oslo_concurrency.lockutils [None req-29b03021-5125-4808-8eb6-162ef4d727f0 tempest-ServerActionsTestOtherB-2034715177 tempest-ServerActionsTestOtherB-2034715177-project-member] Acquiring lock "825f4673-e46f-4b32-a3d8-4bb163ac2390" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.310763] env[60722]: DEBUG oslo_concurrency.lockutils [None req-29b03021-5125-4808-8eb6-162ef4d727f0 tempest-ServerActionsTestOtherB-2034715177 tempest-ServerActionsTestOtherB-2034715177-project-member] Lock "825f4673-e46f-4b32-a3d8-4bb163ac2390" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 714.419956] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 714.944555] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 714.944774] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 714.944926] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60722) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 715.940294] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 715.943598] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 716.944547] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 716.944883] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Starting heal instance info cache {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 716.944883] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Rebuilding the list of instances to heal {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 716.964463] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 716.964614] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 716.964745] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 716.964872] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 716.964992] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 716.965127] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 716.965246] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Skipping network cache update for instance because it is Building. 
{{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 716.965362] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 716.965480] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 716.965596] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 716.965715] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Didn't find any instances for network info cache update. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 716.966130] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 716.966313] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 716.976741] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.976938] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.977105] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.977253] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 716.978324] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0293a452-169b-4606-acef-63b7ffba1e83 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.987320] env[60722]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a0d994d-99cc-4cf2-a9e4-8684a512af1f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.001523] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efeac3a4-4530-4195-866b-f9be0b9eff55 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.007809] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bd1bda7-8bab-47e9-9186-7109509ea811 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.037501] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181728MB free_disk=100GB free_vcpus=48 pci_devices=None {{(pid=60722) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 717.037700] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.038126] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 717.104336] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance d1623803-5152-47d0-b1a7-5e8d4ab06233 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 717.104489] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance d09f5c24-d76b-4ff9-acdd-8da94d70f9cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 717.104615] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 65901b4a-42cf-4795-abc2-b0fea1f4fee7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 717.104735] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 1b18a8e4-eab9-4f28-bd87-a354c436b51c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 717.104854] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance bfde3558-9940-4402-bdf9-15c23c285a8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 717.104970] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance bc2a1e45-2f48-4a73-bfee-69a20725a610 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 717.105149] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance e93b8d4b-6286-410a-870a-02fa7e59d90d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 717.105277] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 93268011-e1f2-4041-b4df-473c06d3f1eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 717.105391] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance eae8d9ce-9fe3-411e-9fd8-05920fb0af04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 717.105503] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 4e66f1dc-18c6-4d64-9bbe-9b061e795a65 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 717.128151] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 22463917-2185-42f7-87b7-2b720be45c22 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 717.151408] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 1c4b8597-88ec-4e79-a749-f802803a5ffe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 717.161529] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 2786801d-6211-4598-b357-4f0a0ffdd7d1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 717.170449] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 019db29d-b8e4-4592-b7c4-2c044e2b2a51 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 717.179892] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 151df220-ca11-4455-b620-f8fe5a1be5b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 717.188157] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 47f34953-a5e3-4b5e-9164-ec5980802298 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 717.196940] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 7f1d5c92-ea40-4ad1-b669-877028b69711 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 717.205834] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance ecade932-ccf5-4a5c-8348-6d88a311f3a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 717.215840] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance f60a4fc6-1a93-4f54-8b34-71aa0a2e035c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 717.223592] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 22eaddb2-108d-44e4-8d9d-85fc91efa9f9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 717.232063] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 825f4673-e46f-4b32-a3d8-4bb163ac2390 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 717.232280] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 717.232427] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 717.470484] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-176e90e4-4c71-4a43-b1ce-c2cce4c36e2f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.478249] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ced10b4-0106-48c2-9a21-5788807d9437 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.508427] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a393fd06-ee5f-40e8-8c73-39da1ccc1bde {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.515685] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73775bab-e993-4ced-ad09-70afcd6b335c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.529030] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 717.537124] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 717.551519] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 717.551784] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.514s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.530228] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 732.865834] env[60722]: WARNING oslo_vmware.rw_handles [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 732.865834] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 732.865834] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 732.865834] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 732.865834] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 732.865834] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 732.865834] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 732.865834] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 732.865834] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 732.865834] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 732.865834] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 732.865834] env[60722]: ERROR oslo_vmware.rw_handles [ 732.866537] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/4e5eb705-ab90-4a3f-9afd-0b0a79997a2c/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 732.867794] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Caching image {{(pid=60722) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 732.868082] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Copying Virtual Disk [datastore1] vmware_temp/4e5eb705-ab90-4a3f-9afd-0b0a79997a2c/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/4e5eb705-ab90-4a3f-9afd-0b0a79997a2c/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 732.868362] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1396f75f-6999-4ca4-997b-dc24149aa856 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 732.876245] env[60722]: DEBUG oslo_vmware.api [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for the task: (returnval){ [ 732.876245] env[60722]: value = "task-565161" [ 732.876245] env[60722]: _type = "Task" [ 732.876245] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 732.884873] env[60722]: DEBUG oslo_vmware.api [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': task-565161, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 733.386272] env[60722]: DEBUG oslo_vmware.exceptions [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Fault InvalidArgument not matched. 
{{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 733.386521] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 733.387168] env[60722]: ERROR nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 733.387168] env[60722]: Faults: ['InvalidArgument'] [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Traceback (most recent call last): [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] yield resources [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] self.driver.spawn(context, instance, image_meta, [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] self._vmops.spawn(context, instance, image_meta, injected_files, [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] self._fetch_image_if_missing(context, vi) [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] image_cache(vi, tmp_image_ds_loc) [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] vm_util.copy_virtual_disk( [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] session._wait_for_task(vmdk_copy_task) [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] return self.wait_for_task(task_ref) [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] return evt.wait() [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] result = hub.switch() [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] return self.greenlet.switch() [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] self.f(*self.args, **self.kw) [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] raise exceptions.translate_fault(task_info.error) [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Faults: ['InvalidArgument'] [ 733.387168] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] [ 733.388156] env[60722]: INFO nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Terminating instance [ 733.389124] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 733.389330] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 733.390033] env[60722]: DEBUG nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Start destroying the instance on the hypervisor. 
{{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 733.390231] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 733.390447] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-117f870b-10f4-49b3-8505-7525b4bdb7d4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.393394] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f0ce706-7aba-4289-8f7c-466c7bcc7d5b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.399996] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 733.400202] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0d15c478-2a0b-4c51-b346-56d9796a8da1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.402217] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 733.402378] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 733.403335] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-85fa27cb-27d4-4484-a228-0de1073d3dad {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.408182] env[60722]: DEBUG oslo_vmware.api [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Waiting for the task: (returnval){ [ 733.408182] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]529041bb-0e61-141d-7ea7-8c840194111c" [ 733.408182] env[60722]: _type = "Task" [ 733.408182] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 733.415828] env[60722]: DEBUG oslo_vmware.api [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]529041bb-0e61-141d-7ea7-8c840194111c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 733.468810] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 733.470628] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 733.470628] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Deleting the datastore file [datastore1] d1623803-5152-47d0-b1a7-5e8d4ab06233 {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 733.470628] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7055e728-c8d2-4413-bb85-d69cff826275 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.475445] env[60722]: DEBUG oslo_vmware.api [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for the task: (returnval){ [ 733.475445] env[60722]: value = "task-565163" [ 733.475445] env[60722]: _type = "Task" [ 733.475445] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 733.484328] env[60722]: DEBUG oslo_vmware.api [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': task-565163, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 733.918873] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 733.919155] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Creating directory with path [datastore1] vmware_temp/7c8a03c3-ca0b-4edc-89ed-c9bf03fd31f0/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 733.919374] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3d6bb3e2-b45f-4130-bfbe-d81d5dc982d2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.930153] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Created directory with path [datastore1] vmware_temp/7c8a03c3-ca0b-4edc-89ed-c9bf03fd31f0/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 733.930337] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Fetch image to [datastore1] vmware_temp/7c8a03c3-ca0b-4edc-89ed-c9bf03fd31f0/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 733.930499] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/7c8a03c3-ca0b-4edc-89ed-c9bf03fd31f0/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 733.931351] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a317fef4-540b-466b-a04b-90b0cbc78f10 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.937864] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4110449-d58d-4333-bda7-e653bbc8a05d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.947383] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5df5327a-ab72-49cf-ab6a-5ae468a78ed8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.980180] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dec7fab-b001-4821-a6b7-c07ae8424293 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 733.986724] env[60722]: 
DEBUG oslo_vmware.api [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': task-565163, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072092} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 733.988114] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 733.988299] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 733.988463] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 733.988630] env[60722]: INFO nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Took 0.60 seconds to destroy the instance on the hypervisor. [ 733.990619] env[60722]: DEBUG nova.compute.claims [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 733.990784] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 733.990985] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 733.993318] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fa960d49-602c-486c-925a-6f18df9fd40d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.081020] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) 
fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 734.128969] env[60722]: DEBUG oslo_vmware.rw_handles [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7c8a03c3-ca0b-4edc-89ed-c9bf03fd31f0/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 734.183488] env[60722]: DEBUG oslo_vmware.rw_handles [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Completed reading data from the image iterator. {{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 734.183651] env[60722]: DEBUG oslo_vmware.rw_handles [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7c8a03c3-ca0b-4edc-89ed-c9bf03fd31f0/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 734.333611] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c126d2d0-b16d-4ffc-a855-29dccace5ca3 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.341164] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b72d5ef9-3690-4b89-9468-2b3df0ef4df5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.370847] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-345e3201-6c35-4a56-95eb-23a45bc242fa {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.377668] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ab9ebb9-324f-41c0-857d-0ca125ce11d1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 734.389999] env[60722]: DEBUG nova.compute.provider_tree [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 734.399740] env[60722]: DEBUG nova.scheduler.client.report [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 734.414277] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.423s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.414782] env[60722]: ERROR nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 734.414782] env[60722]: Faults: ['InvalidArgument'] [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Traceback (most recent call last): [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] self.driver.spawn(context, instance, image_meta, [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] self._vmops.spawn(context, instance, image_meta, injected_files, [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] self._fetch_image_if_missing(context, vi) [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] image_cache(vi, tmp_image_ds_loc) [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] vm_util.copy_virtual_disk( [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] session._wait_for_task(vmdk_copy_task) [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] return self.wait_for_task(task_ref) [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] return evt.wait() [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] result = hub.switch() [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] return self.greenlet.switch() [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] self.f(*self.args, **self.kw) [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] raise exceptions.translate_fault(task_info.error) [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Faults: ['InvalidArgument'] [ 734.414782] env[60722]: ERROR nova.compute.manager [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] [ 734.415647] env[60722]: DEBUG nova.compute.utils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] VimFaultException {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 734.416750] env[60722]: DEBUG nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Build of instance d1623803-5152-47d0-b1a7-5e8d4ab06233 was re-scheduled: A specified parameter was not correct: fileType [ 734.416750] env[60722]: Faults: ['InvalidArgument'] {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 734.417118] env[60722]: DEBUG nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 734.417283] env[60722]: DEBUG nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 734.417454] env[60722]: DEBUG nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 734.417609] env[60722]: DEBUG nova.network.neutron [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 734.654387] env[60722]: DEBUG nova.network.neutron [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 734.668769] env[60722]: INFO nova.compute.manager [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] Took 0.25 seconds to deallocate network for instance. [ 734.756843] env[60722]: INFO nova.scheduler.client.report [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Deleted allocations for instance d1623803-5152-47d0-b1a7-5e8d4ab06233 [ 734.775249] env[60722]: DEBUG oslo_concurrency.lockutils [None req-853f8d15-da0e-405f-8d2c-959ed7a77d75 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "d1623803-5152-47d0-b1a7-5e8d4ab06233" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 146.486s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.776316] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "d1623803-5152-47d0-b1a7-5e8d4ab06233" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 142.595s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.776501] env[60722]: INFO nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: d1623803-5152-47d0-b1a7-5e8d4ab06233] During sync_power_state the instance has a pending task (spawning). Skip. 
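The traceback above follows oslo.vmware's usual task-polling path: the vmwareapi driver submits CopyVirtualDisk_Task and blocks in VMwareAPISession.wait_for_task(); when vCenter reports the task as errored, _poll_task() translates the fault into VimFaultException, whose fault_list carries the raw fault names ('InvalidArgument' here) alongside the message "A specified parameter was not correct: fileType". Below is a minimal sketch of that call pattern, assuming an oslo_vmware.api.VMwareAPISession and an already-submitted task reference; the helper name is hypothetical and this is not Nova source.

from oslo_vmware import exceptions as vexc


def wait_with_fault_diagnostics(session, task_ref):
    """Wait on a vCenter task and surface the raw fault names on failure.

    `session` is assumed to be an oslo_vmware.api.VMwareAPISession and
    `task_ref` a managed object reference for a task such as the
    CopyVirtualDisk_Task seen in the log.
    """
    try:
        return session.wait_for_task(task_ref)
    except vexc.VimFaultException as exc:
        # fault_list holds the VMOMI fault identifiers, e.g. ['InvalidArgument'];
        # the exception text holds the localized message, here
        # "A specified parameter was not correct: fileType".
        if 'InvalidArgument' in (exc.fault_list or []):
            # A caller could attach extra context here before re-raising.
            pass
        raise

Because the fault propagates out of _build_and_run_instance, Nova treats it as a build failure and reschedules the instance, which is exactly what the "was re-scheduled" and "Deleted allocations" entries above record.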
[ 734.776872] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "d1623803-5152-47d0-b1a7-5e8d4ab06233" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 734.798168] env[60722]: DEBUG nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 734.846632] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 734.846864] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 734.848344] env[60722]: INFO nova.compute.claims [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 735.097608] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f631fc47-5795-4bb0-ab9d-a04dab7a0546 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.106049] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e305537e-021f-403f-90b7-2a8b9cb51c0f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.133933] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65d4e4bd-08eb-40aa-82e1-ef17e168e2c6 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.141025] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9738e58-9bd1-4ba6-a51b-a4d1f65c043f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.153753] env[60722]: DEBUG nova.compute.provider_tree [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 735.182147] env[60722]: DEBUG nova.scheduler.client.report [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Inventory has not changed for provider 
6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 735.195537] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.349s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 735.196058] env[60722]: DEBUG nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 735.228890] env[60722]: DEBUG nova.compute.utils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 735.229702] env[60722]: DEBUG nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 735.229769] env[60722]: DEBUG nova.network.neutron [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 735.239149] env[60722]: DEBUG nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 735.302015] env[60722]: DEBUG nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 735.324466] env[60722]: DEBUG nova.policy [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ec62f6cba98944b5bf95f4ae8add3ac5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '06180241ffc149bcbc3a41d48d380762', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 735.327806] env[60722]: DEBUG nova.virt.hardware [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 735.328039] env[60722]: DEBUG nova.virt.hardware [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 735.328193] env[60722]: DEBUG nova.virt.hardware [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 735.328413] env[60722]: DEBUG nova.virt.hardware [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 735.328564] env[60722]: DEBUG nova.virt.hardware [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 735.328708] env[60722]: DEBUG nova.virt.hardware [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 735.328916] env[60722]: DEBUG nova.virt.hardware [None 
req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 735.329074] env[60722]: DEBUG nova.virt.hardware [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 735.329236] env[60722]: DEBUG nova.virt.hardware [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 735.329393] env[60722]: DEBUG nova.virt.hardware [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 735.329584] env[60722]: DEBUG nova.virt.hardware [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 735.330709] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e55cbc3-1f15-49be-8f00-be2217c6cd72 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.338758] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65d26836-c326-4570-b0f3-88a27dc613b3 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 735.659765] env[60722]: DEBUG nova.network.neutron [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Successfully created port: 3493a62c-0ddc-4402-9192-74f0d77f00d2 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 736.278762] env[60722]: DEBUG nova.network.neutron [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Successfully updated port: 3493a62c-0ddc-4402-9192-74f0d77f00d2 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 736.291941] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Acquiring lock "refresh_cache-22463917-2185-42f7-87b7-2b720be45c22" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 736.292336] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] 
Acquired lock "refresh_cache-22463917-2185-42f7-87b7-2b720be45c22" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 736.293337] env[60722]: DEBUG nova.network.neutron [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 736.363110] env[60722]: DEBUG nova.network.neutron [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 736.565100] env[60722]: DEBUG nova.network.neutron [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Updating instance_info_cache with network_info: [{"id": "3493a62c-0ddc-4402-9192-74f0d77f00d2", "address": "fa:16:3e:80:72:4b", "network": {"id": "dace1c00-c6b0-4bd3-a37a-e783f0c57575", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1193565109-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "06180241ffc149bcbc3a41d48d380762", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9079d3b9-5c2d-4ca1-8d2f-68ceb8ec8c98", "external-id": "nsx-vlan-transportzone-527", "segmentation_id": 527, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3493a62c-0d", "ovs_interfaceid": "3493a62c-0ddc-4402-9192-74f0d77f00d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 736.576038] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Releasing lock "refresh_cache-22463917-2185-42f7-87b7-2b720be45c22" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 736.576293] env[60722]: DEBUG nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Instance network_info: |[{"id": "3493a62c-0ddc-4402-9192-74f0d77f00d2", "address": "fa:16:3e:80:72:4b", "network": {"id": "dace1c00-c6b0-4bd3-a37a-e783f0c57575", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1193565109-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "06180241ffc149bcbc3a41d48d380762", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9079d3b9-5c2d-4ca1-8d2f-68ceb8ec8c98", "external-id": "nsx-vlan-transportzone-527", "segmentation_id": 527, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3493a62c-0d", "ovs_interfaceid": "3493a62c-0ddc-4402-9192-74f0d77f00d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 736.576653] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:80:72:4b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '9079d3b9-5c2d-4ca1-8d2f-68ceb8ec8c98', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3493a62c-0ddc-4402-9192-74f0d77f00d2', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 736.584839] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Creating folder: Project (06180241ffc149bcbc3a41d48d380762). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 736.585327] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0ef82be1-27bb-4877-9a09-fca0d31e83e2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.596054] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Created folder: Project (06180241ffc149bcbc3a41d48d380762) in parent group-v141606. [ 736.597110] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Creating folder: Instances. Parent ref: group-v141641. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 736.597110] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ca7e1e41-5c3e-4655-9275-830909cc7cbb {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.604939] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Created folder: Instances in parent group-v141641. [ 736.605139] env[60722]: DEBUG oslo.service.loopingcall [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 736.605308] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 736.605485] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-927fa61e-a54d-4b99-8d40-7cfef533aadc {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 736.624497] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 736.624497] env[60722]: value = "task-565166" [ 736.624497] env[60722]: _type = "Task" [ 736.624497] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 736.631936] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565166, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 736.722339] env[60722]: DEBUG nova.compute.manager [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Received event network-vif-plugged-3493a62c-0ddc-4402-9192-74f0d77f00d2 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 736.722339] env[60722]: DEBUG oslo_concurrency.lockutils [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] Acquiring lock "22463917-2185-42f7-87b7-2b720be45c22-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 736.722339] env[60722]: DEBUG oslo_concurrency.lockutils [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] Lock "22463917-2185-42f7-87b7-2b720be45c22-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 736.722339] env[60722]: DEBUG oslo_concurrency.lockutils [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] Lock "22463917-2185-42f7-87b7-2b720be45c22-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 736.722339] env[60722]: DEBUG nova.compute.manager [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] [instance: 22463917-2185-42f7-87b7-2b720be45c22] No waiting events found dispatching network-vif-plugged-3493a62c-0ddc-4402-9192-74f0d77f00d2 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 736.722505] env[60722]: WARNING nova.compute.manager [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Received unexpected event network-vif-plugged-3493a62c-0ddc-4402-9192-74f0d77f00d2 for instance with vm_state building and task_state spawning. 
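The inventory dictionary reported above for provider 6d7f336b-9351-4171-8197-866cdafbab42 is what placement sizes this compute node against: usable capacity per resource class is (total - reserved) * allocation_ratio. A small illustrative calculation (not Nova code; min_unit, max_unit and step_size omitted) using the logged values:

# Illustration only: derive placement capacity from the inventory logged above.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    # Placement checks new allocations against this derived capacity.
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc:10s} capacity = {capacity:.0f}")

# VCPU       capacity = 192     (48 physical vCPUs, oversubscribed 4x)
# MEMORY_MB  capacity = 196078
# DISK_GB    capacity = 400

Against that capacity the m1.nano claim above (1 VCPU, 128 MB RAM, 1 GB disk) fits easily, consistent with the successful claim logged at [ 734.848344].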
[ 736.722642] env[60722]: DEBUG nova.compute.manager [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Received event network-changed-3493a62c-0ddc-4402-9192-74f0d77f00d2 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 736.722873] env[60722]: DEBUG nova.compute.manager [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Refreshing instance network info cache due to event network-changed-3493a62c-0ddc-4402-9192-74f0d77f00d2. {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 736.723128] env[60722]: DEBUG oslo_concurrency.lockutils [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] Acquiring lock "refresh_cache-22463917-2185-42f7-87b7-2b720be45c22" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 736.723273] env[60722]: DEBUG oslo_concurrency.lockutils [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] Acquired lock "refresh_cache-22463917-2185-42f7-87b7-2b720be45c22" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 736.723429] env[60722]: DEBUG nova.network.neutron [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Refreshing network info cache for port 3493a62c-0ddc-4402-9192-74f0d77f00d2 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 736.996462] env[60722]: DEBUG nova.network.neutron [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Updated VIF entry in instance network info cache for port 3493a62c-0ddc-4402-9192-74f0d77f00d2. 
{{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 736.996821] env[60722]: DEBUG nova.network.neutron [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Updating instance_info_cache with network_info: [{"id": "3493a62c-0ddc-4402-9192-74f0d77f00d2", "address": "fa:16:3e:80:72:4b", "network": {"id": "dace1c00-c6b0-4bd3-a37a-e783f0c57575", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1193565109-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "06180241ffc149bcbc3a41d48d380762", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "9079d3b9-5c2d-4ca1-8d2f-68ceb8ec8c98", "external-id": "nsx-vlan-transportzone-527", "segmentation_id": 527, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3493a62c-0d", "ovs_interfaceid": "3493a62c-0ddc-4402-9192-74f0d77f00d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 737.006218] env[60722]: DEBUG oslo_concurrency.lockutils [req-2e2e343f-bf54-4f4f-8bae-515caf4a65ba req-ef909079-cedd-40f9-8fd9-3c867741f1a1 service nova] Releasing lock "refresh_cache-22463917-2185-42f7-87b7-2b720be45c22" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 737.133959] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565166, 'name': CreateVM_Task, 'duration_secs': 0.298556} completed successfully. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 737.134132] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 737.134761] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 737.134915] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 737.135241] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 737.135470] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-66af6fa4-8bad-4347-baa9-2cc65f8bf972 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 737.140602] env[60722]: DEBUG oslo_vmware.api [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Waiting for the task: (returnval){ [ 737.140602] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52590bc9-eb6a-b823-ab74-6bc93bf476d6" [ 737.140602] env[60722]: _type = "Task" [ 737.140602] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 737.147892] env[60722]: DEBUG oslo_vmware.api [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52590bc9-eb6a-b823-ab74-6bc93bf476d6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 737.651105] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 737.651397] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 737.651516] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 774.944442] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 774.944742] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60722) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 775.940520] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 775.944048] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 775.944193] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 776.945980] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 776.945980] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Starting heal instance info cache {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 776.945980] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Rebuilding the list of instances to heal {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 776.965166] env[60722]: DEBUG nova.compute.manager [None 
req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 776.966025] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 776.966025] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 776.966025] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 776.966025] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 776.966025] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 776.966322] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 776.966322] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 776.966414] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 776.966461] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 776.966583] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Didn't find any instances for network info cache update. 
{{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 776.967063] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 776.967254] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 777.944555] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 777.966499] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 777.975707] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 777.975930] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 777.976156] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 777.976315] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 777.977348] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3c55c1c-283c-499d-897f-2d398a3ca538 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 777.986240] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0ddc1c0-40aa-4423-9a8b-447b32289cc1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.000211] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7829e919-8679-4dd6-9139-ed80dbc9d1b3 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.006642] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb4f0efc-61d9-4b83-9bd2-9189572b6578 {{(pid=60722) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.035184] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181715MB free_disk=100GB free_vcpus=48 pci_devices=None {{(pid=60722) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 778.035344] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 778.035530] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 778.100201] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance d09f5c24-d76b-4ff9-acdd-8da94d70f9cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 778.100394] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 65901b4a-42cf-4795-abc2-b0fea1f4fee7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 778.100529] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 1b18a8e4-eab9-4f28-bd87-a354c436b51c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 778.100655] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance bfde3558-9940-4402-bdf9-15c23c285a8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 778.100770] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance bc2a1e45-2f48-4a73-bfee-69a20725a610 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 778.100949] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance e93b8d4b-6286-410a-870a-02fa7e59d90d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 778.101138] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 93268011-e1f2-4041-b4df-473c06d3f1eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 778.101275] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance eae8d9ce-9fe3-411e-9fd8-05920fb0af04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 778.101391] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 4e66f1dc-18c6-4d64-9bbe-9b061e795a65 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 778.101501] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 22463917-2185-42f7-87b7-2b720be45c22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 778.112158] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 1c4b8597-88ec-4e79-a749-f802803a5ffe has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 778.122784] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 2786801d-6211-4598-b357-4f0a0ffdd7d1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 778.132163] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 019db29d-b8e4-4592-b7c4-2c044e2b2a51 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 778.141781] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 151df220-ca11-4455-b620-f8fe5a1be5b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 778.150408] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 47f34953-a5e3-4b5e-9164-ec5980802298 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 778.159200] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 7f1d5c92-ea40-4ad1-b669-877028b69711 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 778.169018] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance ecade932-ccf5-4a5c-8348-6d88a311f3a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 778.177378] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance f60a4fc6-1a93-4f54-8b34-71aa0a2e035c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 778.187305] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 22eaddb2-108d-44e4-8d9d-85fc91efa9f9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 778.196908] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 825f4673-e46f-4b32-a3d8-4bb163ac2390 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 778.196908] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 778.196908] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 778.405706] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b477c6b-590f-41f5-bd1d-7c27436c10a2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.413563] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e450c9a-179b-46d5-992c-0c70dba0dc0e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.442947] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-680068d4-387d-4e3f-87e9-1b376bac71cc {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.448989] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe07a656-7c50-4082-8f12-8512ed09fd2c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 778.461969] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 778.470479] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 778.484398] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 778.484572] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.449s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 779.463351] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running 
periodic task ComputeManager._poll_rescued_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 780.175440] env[60722]: WARNING oslo_vmware.rw_handles [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 780.175440] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 780.175440] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 780.175440] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 780.175440] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 780.175440] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 780.175440] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 780.175440] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 780.175440] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 780.175440] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 780.175440] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 780.175440] env[60722]: ERROR oslo_vmware.rw_handles [ 780.175803] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/7c8a03c3-ca0b-4edc-89ed-c9bf03fd31f0/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 780.177348] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Caching image {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 780.177589] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Copying Virtual Disk [datastore1] vmware_temp/7c8a03c3-ca0b-4edc-89ed-c9bf03fd31f0/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/7c8a03c3-ca0b-4edc-89ed-c9bf03fd31f0/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 780.177856] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-251b4411-df4e-46f9-aa7b-0ad493156481 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.186581] env[60722]: DEBUG oslo_vmware.api [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Waiting for the task: (returnval){ [ 780.186581] 
env[60722]: value = "task-565167" [ 780.186581] env[60722]: _type = "Task" [ 780.186581] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 780.194405] env[60722]: DEBUG oslo_vmware.api [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Task: {'id': task-565167, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 780.697722] env[60722]: DEBUG oslo_vmware.exceptions [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Fault InvalidArgument not matched. {{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 780.697984] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 780.698654] env[60722]: ERROR nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 780.698654] env[60722]: Faults: ['InvalidArgument'] [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Traceback (most recent call last): [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] yield resources [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] self.driver.spawn(context, instance, image_meta, [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] self._fetch_image_if_missing(context, vi) [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] image_cache(vi, tmp_image_ds_loc) [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: 
d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] vm_util.copy_virtual_disk( [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] session._wait_for_task(vmdk_copy_task) [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] return self.wait_for_task(task_ref) [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] return evt.wait() [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] result = hub.switch() [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] return self.greenlet.switch() [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] self.f(*self.args, **self.kw) [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] raise exceptions.translate_fault(task_info.error) [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Faults: ['InvalidArgument'] [ 780.698654] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] [ 780.699699] env[60722]: INFO nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Terminating instance [ 780.700432] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 780.700631] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 780.700848] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2c87f43e-5326-4a0e-9953-57134554b719 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.703051] env[60722]: DEBUG nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 780.703239] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 780.703991] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e6d8e09-9635-4256-ac5d-1e74aec18479 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.710772] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 780.710975] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7f2aa34e-2733-4cb7-82d4-b75fae44dd51 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.713163] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 780.713329] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 780.714253] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8cb42fe6-4363-4d76-af7b-7ab93e7655f2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.718712] env[60722]: DEBUG oslo_vmware.api [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for the task: (returnval){ [ 780.718712] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52902eba-1167-68f4-4701-84a538dd55dc" [ 780.718712] env[60722]: _type = "Task" [ 780.718712] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 780.725746] env[60722]: DEBUG oslo_vmware.api [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52902eba-1167-68f4-4701-84a538dd55dc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 780.790615] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 780.790863] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 780.791054] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Deleting the datastore file [datastore1] d09f5c24-d76b-4ff9-acdd-8da94d70f9cb {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 780.791308] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-33980c59-a6ad-43f1-a960-f7e29df6155e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 780.797845] env[60722]: DEBUG oslo_vmware.api [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Waiting for the task: (returnval){ [ 780.797845] env[60722]: value = "task-565169" [ 780.797845] env[60722]: _type = "Task" [ 780.797845] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 780.808903] env[60722]: DEBUG oslo_vmware.api [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Task: {'id': task-565169, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 781.228368] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 781.228609] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Creating directory with path [datastore1] vmware_temp/0a15ee81-db1d-4fe8-8952-8730e44b02f4/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 781.228822] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c5c684ec-20f3-4f03-84a5-e7bff42c8d82 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 781.239796] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Created directory with path [datastore1] vmware_temp/0a15ee81-db1d-4fe8-8952-8730e44b02f4/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 781.239971] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Fetch image to [datastore1] vmware_temp/0a15ee81-db1d-4fe8-8952-8730e44b02f4/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 781.240147] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/0a15ee81-db1d-4fe8-8952-8730e44b02f4/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 781.240812] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16a2fc8f-f8a5-409b-a859-9255e8a2378b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 781.246801] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a5eca25-07ea-44aa-a7fe-9515ba9abeec {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 781.255335] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f995a2de-b16a-4a3c-8717-312b0193888f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 781.286447] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1b1604d-1765-4661-b2f1-45ceca7cdb4a {{(pid=60722) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 781.292813] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-727dc521-849a-48ae-a926-a3bfdb17ee55 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 781.306927] env[60722]: DEBUG oslo_vmware.api [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Task: {'id': task-565169, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064416} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 781.307185] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 781.307364] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 781.307586] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 781.307768] env[60722]: INFO nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 781.310202] env[60722]: DEBUG nova.compute.claims [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 781.310370] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 781.310682] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 781.314917] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 781.363474] env[60722]: DEBUG oslo_vmware.rw_handles [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0a15ee81-db1d-4fe8-8952-8730e44b02f4/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 781.420806] env[60722]: DEBUG oslo_vmware.rw_handles [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Completed reading data from the image iterator. {{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 781.421184] env[60722]: DEBUG oslo_vmware.rw_handles [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0a15ee81-db1d-4fe8-8952-8730e44b02f4/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 781.621525] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b6366fa-945e-4705-a38e-ef4b26b4e31d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 781.629102] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d702427-4c2b-489c-b6d3-4ecec0f58783 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 781.658842] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cbb0e11-83b1-40a5-81dc-2d104013e97c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 781.665642] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9b0f066-5cc1-4d5c-a709-07e63c7fe459 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 781.678369] env[60722]: DEBUG nova.compute.provider_tree [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 781.688547] env[60722]: DEBUG nova.scheduler.client.report [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 781.702854] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.392s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 781.703396] env[60722]: ERROR nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 781.703396] env[60722]: Faults: ['InvalidArgument'] [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Traceback (most recent call last): [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] self.driver.spawn(context, instance, image_meta, [ 
781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] self._fetch_image_if_missing(context, vi) [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] image_cache(vi, tmp_image_ds_loc) [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] vm_util.copy_virtual_disk( [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] session._wait_for_task(vmdk_copy_task) [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] return self.wait_for_task(task_ref) [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] return evt.wait() [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] result = hub.switch() [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] return self.greenlet.switch() [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] self.f(*self.args, **self.kw) [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] raise 
exceptions.translate_fault(task_info.error) [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Faults: ['InvalidArgument'] [ 781.703396] env[60722]: ERROR nova.compute.manager [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] [ 781.704179] env[60722]: DEBUG nova.compute.utils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] VimFaultException {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 781.705605] env[60722]: DEBUG nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Build of instance d09f5c24-d76b-4ff9-acdd-8da94d70f9cb was re-scheduled: A specified parameter was not correct: fileType [ 781.705605] env[60722]: Faults: ['InvalidArgument'] {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 781.705977] env[60722]: DEBUG nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 781.706161] env[60722]: DEBUG nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 781.706324] env[60722]: DEBUG nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 781.706480] env[60722]: DEBUG nova.network.neutron [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 782.020052] env[60722]: DEBUG nova.network.neutron [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 782.027256] env[60722]: INFO nova.compute.manager [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] Took 0.32 seconds to deallocate network for instance. 
[ 782.115849] env[60722]: INFO nova.scheduler.client.report [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Deleted allocations for instance d09f5c24-d76b-4ff9-acdd-8da94d70f9cb [ 782.131298] env[60722]: DEBUG oslo_concurrency.lockutils [None req-45621045-fc1a-4d63-b7f4-e51a37c52430 tempest-ServersTestJSON-1160895630 tempest-ServersTestJSON-1160895630-project-member] Lock "d09f5c24-d76b-4ff9-acdd-8da94d70f9cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 191.569s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 782.132298] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "d09f5c24-d76b-4ff9-acdd-8da94d70f9cb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 189.951s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 782.132476] env[60722]: INFO nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: d09f5c24-d76b-4ff9-acdd-8da94d70f9cb] During sync_power_state the instance has a pending task (spawning). Skip. [ 782.132641] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "d09f5c24-d76b-4ff9-acdd-8da94d70f9cb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 782.148684] env[60722]: DEBUG nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Starting instance... 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 782.195025] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 782.195025] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 782.196289] env[60722]: INFO nova.compute.claims [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 782.437864] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64b529bb-82de-4a25-9e97-6442c5593973 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 782.445561] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3329118-bfa7-4785-9123-45566fbaffe0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 782.475074] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d85e319-121c-4c3a-af99-b139bef88516 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 782.482958] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9336c6d5-6768-4fe3-957a-d984f9d3bfd5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 782.495430] env[60722]: DEBUG nova.compute.provider_tree [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 782.506395] env[60722]: DEBUG nova.scheduler.client.report [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 782.521564] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 
tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.327s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 782.522039] env[60722]: DEBUG nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 782.553639] env[60722]: DEBUG nova.compute.utils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 782.555122] env[60722]: DEBUG nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 782.555367] env[60722]: DEBUG nova.network.neutron [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 782.563207] env[60722]: DEBUG nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 782.625803] env[60722]: DEBUG nova.policy [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f39d600f4f34748a0a0e8053ca5a845', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '68d27ef896b548ed88853b436758ebab', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 782.629159] env[60722]: DEBUG nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 782.651262] env[60722]: DEBUG nova.virt.hardware [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 782.651498] env[60722]: DEBUG nova.virt.hardware [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 782.651651] env[60722]: DEBUG nova.virt.hardware [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 782.651825] env[60722]: DEBUG nova.virt.hardware [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 782.651964] env[60722]: DEBUG nova.virt.hardware [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 782.652120] env[60722]: DEBUG nova.virt.hardware [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 782.652320] env[60722]: DEBUG nova.virt.hardware [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 782.652472] env[60722]: DEBUG nova.virt.hardware [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 782.652631] 
env[60722]: DEBUG nova.virt.hardware [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 782.652849] env[60722]: DEBUG nova.virt.hardware [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 782.653039] env[60722]: DEBUG nova.virt.hardware [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 782.653886] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e21c8c1-e134-43fe-8e00-85467f50fe25 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 782.661797] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6f536a2-0812-405b-8fad-24b13fb50ca9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 783.146675] env[60722]: DEBUG nova.network.neutron [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Successfully created port: 5c5a23cd-e91f-4ba3-8aac-d344adde8784 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 783.943193] env[60722]: DEBUG nova.network.neutron [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Successfully updated port: 5c5a23cd-e91f-4ba3-8aac-d344adde8784 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 783.951427] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Acquiring lock "refresh_cache-1c4b8597-88ec-4e79-a749-f802803a5ffe" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 783.951565] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Acquired lock "refresh_cache-1c4b8597-88ec-4e79-a749-f802803a5ffe" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 783.951713] env[60722]: DEBUG nova.network.neutron [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 783.992158] env[60722]: DEBUG nova.network.neutron [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 
tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 784.048734] env[60722]: DEBUG nova.compute.manager [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Received event network-vif-plugged-5c5a23cd-e91f-4ba3-8aac-d344adde8784 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 784.048868] env[60722]: DEBUG oslo_concurrency.lockutils [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] Acquiring lock "1c4b8597-88ec-4e79-a749-f802803a5ffe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 784.049069] env[60722]: DEBUG oslo_concurrency.lockutils [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] Lock "1c4b8597-88ec-4e79-a749-f802803a5ffe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 784.049258] env[60722]: DEBUG oslo_concurrency.lockutils [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] Lock "1c4b8597-88ec-4e79-a749-f802803a5ffe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 784.049381] env[60722]: DEBUG nova.compute.manager [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] No waiting events found dispatching network-vif-plugged-5c5a23cd-e91f-4ba3-8aac-d344adde8784 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 784.049540] env[60722]: WARNING nova.compute.manager [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Received unexpected event network-vif-plugged-5c5a23cd-e91f-4ba3-8aac-d344adde8784 for instance with vm_state building and task_state spawning. [ 784.049693] env[60722]: DEBUG nova.compute.manager [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Received event network-changed-5c5a23cd-e91f-4ba3-8aac-d344adde8784 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 784.049842] env[60722]: DEBUG nova.compute.manager [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Refreshing instance network info cache due to event network-changed-5c5a23cd-e91f-4ba3-8aac-d344adde8784. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 784.050008] env[60722]: DEBUG oslo_concurrency.lockutils [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] Acquiring lock "refresh_cache-1c4b8597-88ec-4e79-a749-f802803a5ffe" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 784.178572] env[60722]: DEBUG nova.network.neutron [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Updating instance_info_cache with network_info: [{"id": "5c5a23cd-e91f-4ba3-8aac-d344adde8784", "address": "fa:16:3e:ad:b6:af", "network": {"id": "1e45a79d-0d67-49e2-b05f-7562681d0fd6", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1474541096-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "68d27ef896b548ed88853b436758ebab", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a316376e-2ef0-4b1e-b40c-10321ebd7e1a", "external-id": "nsx-vlan-transportzone-942", "segmentation_id": 942, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c5a23cd-e9", "ovs_interfaceid": "5c5a23cd-e91f-4ba3-8aac-d344adde8784", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 784.195858] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Releasing lock "refresh_cache-1c4b8597-88ec-4e79-a749-f802803a5ffe" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 784.196015] env[60722]: DEBUG nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Instance network_info: |[{"id": "5c5a23cd-e91f-4ba3-8aac-d344adde8784", "address": "fa:16:3e:ad:b6:af", "network": {"id": "1e45a79d-0d67-49e2-b05f-7562681d0fd6", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1474541096-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "68d27ef896b548ed88853b436758ebab", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a316376e-2ef0-4b1e-b40c-10321ebd7e1a", "external-id": "nsx-vlan-transportzone-942", "segmentation_id": 942, "bound_drivers": {"0": "nsxv3"}}, "devname": 
"tap5c5a23cd-e9", "ovs_interfaceid": "5c5a23cd-e91f-4ba3-8aac-d344adde8784", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 784.196310] env[60722]: DEBUG oslo_concurrency.lockutils [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] Acquired lock "refresh_cache-1c4b8597-88ec-4e79-a749-f802803a5ffe" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 784.196496] env[60722]: DEBUG nova.network.neutron [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Refreshing network info cache for port 5c5a23cd-e91f-4ba3-8aac-d344adde8784 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 784.201521] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ad:b6:af', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a316376e-2ef0-4b1e-b40c-10321ebd7e1a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5c5a23cd-e91f-4ba3-8aac-d344adde8784', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 784.205185] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Creating folder: Project (68d27ef896b548ed88853b436758ebab). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 784.206120] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7d19c915-0443-44b6-b8cb-24f7e48dd78f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.219302] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Created folder: Project (68d27ef896b548ed88853b436758ebab) in parent group-v141606. [ 784.219480] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Creating folder: Instances. Parent ref: group-v141644. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 784.219699] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-82fddd7e-4859-4524-805d-b90c522b5e29 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.231930] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Created folder: Instances in parent group-v141644. 
[ 784.232192] env[60722]: DEBUG oslo.service.loopingcall [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 784.232335] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 784.232520] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f108c241-d115-4e4c-9c8b-ca92208dcef1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.259970] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 784.259970] env[60722]: value = "task-565172" [ 784.259970] env[60722]: _type = "Task" [ 784.259970] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 784.268502] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565172, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 784.525305] env[60722]: DEBUG nova.network.neutron [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Updated VIF entry in instance network info cache for port 5c5a23cd-e91f-4ba3-8aac-d344adde8784. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 784.525630] env[60722]: DEBUG nova.network.neutron [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Updating instance_info_cache with network_info: [{"id": "5c5a23cd-e91f-4ba3-8aac-d344adde8784", "address": "fa:16:3e:ad:b6:af", "network": {"id": "1e45a79d-0d67-49e2-b05f-7562681d0fd6", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1474541096-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "68d27ef896b548ed88853b436758ebab", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a316376e-2ef0-4b1e-b40c-10321ebd7e1a", "external-id": "nsx-vlan-transportzone-942", "segmentation_id": 942, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c5a23cd-e9", "ovs_interfaceid": "5c5a23cd-e91f-4ba3-8aac-d344adde8784", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 784.539250] env[60722]: DEBUG oslo_concurrency.lockutils [req-8f6700f3-b3cf-4226-b96e-866d224b1c6c req-bcc3d4c1-1a50-4e39-8934-4c121c120529 service nova] Releasing lock "refresh_cache-1c4b8597-88ec-4e79-a749-f802803a5ffe" {{(pid=60722) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 784.771659] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565172, 'name': CreateVM_Task, 'duration_secs': 0.278336} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 784.773538] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 784.774226] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 784.774379] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 784.774705] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 784.774935] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e82138ae-ce76-472a-9551-d6760c49fdaf {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 784.780204] env[60722]: DEBUG oslo_vmware.api [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Waiting for the task: (returnval){ [ 784.780204] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52f7b035-dde9-16ea-955e-adda3ad3fb47" [ 784.780204] env[60722]: _type = "Task" [ 784.780204] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 784.788138] env[60722]: DEBUG oslo_vmware.api [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52f7b035-dde9-16ea-955e-adda3ad3fb47, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 785.291188] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 785.291451] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 785.291620] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 789.438273] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquiring lock "b9025e22-8080-4887-8e4e-179866f704ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 789.438569] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Lock "b9025e22-8080-4887-8e4e-179866f704ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 789.462186] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquiring lock "020c2b79-e755-4178-aa85-5ecaa31e7a9f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 789.462434] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Lock "020c2b79-e755-4178-aa85-5ecaa31e7a9f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 791.263491] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Acquiring lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60722) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 794.014665] env[60722]: DEBUG oslo_concurrency.lockutils [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "bc2a1e45-2f48-4a73-bfee-69a20725a610" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 798.714855] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "e93b8d4b-6286-410a-870a-02fa7e59d90d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 799.256695] env[60722]: DEBUG oslo_concurrency.lockutils [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "93268011-e1f2-4041-b4df-473c06d3f1eb" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 831.023238] env[60722]: WARNING oslo_vmware.rw_handles [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 831.023238] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 831.023238] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 831.023238] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 831.023238] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 831.023238] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 831.023238] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 831.023238] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 831.023238] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 831.023238] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 831.023238] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 831.023238] env[60722]: ERROR oslo_vmware.rw_handles [ 831.023965] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/0a15ee81-db1d-4fe8-8952-8730e44b02f4/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 831.025281] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 
tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Caching image {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 831.025510] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Copying Virtual Disk [datastore1] vmware_temp/0a15ee81-db1d-4fe8-8952-8730e44b02f4/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/0a15ee81-db1d-4fe8-8952-8730e44b02f4/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 831.025780] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a0796a50-9083-499b-9e30-0b14e476f20a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.033360] env[60722]: DEBUG oslo_vmware.api [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for the task: (returnval){ [ 831.033360] env[60722]: value = "task-565173" [ 831.033360] env[60722]: _type = "Task" [ 831.033360] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 831.041063] env[60722]: DEBUG oslo_vmware.api [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': task-565173, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 831.543956] env[60722]: DEBUG oslo_vmware.exceptions [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Fault InvalidArgument not matched. 
{{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 831.544234] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 831.544788] env[60722]: ERROR nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 831.544788] env[60722]: Faults: ['InvalidArgument'] [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Traceback (most recent call last): [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] yield resources [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] self.driver.spawn(context, instance, image_meta, [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] self._fetch_image_if_missing(context, vi) [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] image_cache(vi, tmp_image_ds_loc) [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] vm_util.copy_virtual_disk( [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] session._wait_for_task(vmdk_copy_task) [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] return self.wait_for_task(task_ref) [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] return evt.wait() [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] result = hub.switch() [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] return self.greenlet.switch() [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] self.f(*self.args, **self.kw) [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] raise exceptions.translate_fault(task_info.error) [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Faults: ['InvalidArgument'] [ 831.544788] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] [ 831.546013] env[60722]: INFO nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Terminating instance [ 831.547154] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 831.547409] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 831.548028] env[60722]: DEBUG nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Start destroying the instance on the 
hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 831.548219] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 831.548429] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ca272699-009f-47f5-adf6-b8d90f8bdb4f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.550996] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e834e218-994b-4fb9-8662-ea2b3bb21f5d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.557782] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 831.557960] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-89b5a02d-14ff-4f40-8bce-6b359b2c6321 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.560126] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 831.560286] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 831.561190] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f393d83e-2ae9-4574-af8b-3834e825b154 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.566100] env[60722]: DEBUG oslo_vmware.api [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Waiting for the task: (returnval){ [ 831.566100] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52aa437f-38f0-9099-5cd1-153412dfed23" [ 831.566100] env[60722]: _type = "Task" [ 831.566100] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 831.574956] env[60722]: DEBUG oslo_vmware.api [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52aa437f-38f0-9099-5cd1-153412dfed23, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 831.623079] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 831.623256] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 831.623436] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Deleting the datastore file [datastore1] 1b18a8e4-eab9-4f28-bd87-a354c436b51c {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 831.623692] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3befa834-996a-4a84-876d-b3ff8a8b5c04 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 831.629966] env[60722]: DEBUG oslo_vmware.api [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for the task: (returnval){ [ 831.629966] env[60722]: value = "task-565175" [ 831.629966] env[60722]: _type = "Task" [ 831.629966] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 831.638490] env[60722]: DEBUG oslo_vmware.api [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': task-565175, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 832.076390] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 832.076738] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Creating directory with path [datastore1] vmware_temp/11df168d-49b9-4ee1-941e-d7995ea7e75c/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 832.076832] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2c6c1445-8351-4d37-976f-d6da51629cfa {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 832.088167] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Created directory with path [datastore1] vmware_temp/11df168d-49b9-4ee1-941e-d7995ea7e75c/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 832.088389] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Fetch image to [datastore1] vmware_temp/11df168d-49b9-4ee1-941e-d7995ea7e75c/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 832.088596] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/11df168d-49b9-4ee1-941e-d7995ea7e75c/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 832.089337] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83ea3638-acf5-4db1-ab9a-135dfd540d5e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 832.095750] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feebbd26-2256-4efc-8cfb-e4cb0964406c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 832.104237] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c928d4b-2de5-4766-9a60-4f8679834a91 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 832.137167] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cafab7a-aa20-4347-98aa-195ea6325c78 {{(pid=60722) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 832.143751] env[60722]: DEBUG oslo_vmware.api [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': task-565175, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084668} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 832.145191] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 832.145375] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 832.145539] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 832.145716] env[60722]: INFO nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Took 0.60 seconds to destroy the instance on the hypervisor. 
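The failed spawn above originates in vm_util.copy_virtual_disk: a CopyVirtualDisk_Task is submitted through the VIM API and the task is polled until it completes or raises a translated fault, here InvalidArgument ("A specified parameter was not correct: fileType"), after which the instance is destroyed and rescheduled. A rough sketch of that invoke-and-wait pattern with oslo.vmware, assuming an already established VMwareAPISession; the function name and datastore paths are illustrative, not Nova's actual code:

from oslo_vmware import exceptions as vexc

def copy_cached_image(session, dc_ref, src_path, dst_path):
    vim = session.vim
    # Submit the copy; this mirrors the CopyVirtualDisk_Task invocation
    # logged above for the devstack-image-cache_base VMDK.
    task = session.invoke_api(
        vim, 'CopyVirtualDisk_Task',
        vim.service_content.virtualDiskManager,
        sourceName=src_path, sourceDatacenter=dc_ref,
        destName=dst_path, destDatacenter=dc_ref)
    try:
        # Polls task progress (the _poll_task lines above) and raises a
        # translated fault if the task errors out.
        session.wait_for_task(task)
    except vexc.VimFaultException as err:
        # err.fault_list carries the fault names, e.g. ['InvalidArgument']
        # in the failure shown above; the caller aborts the spawn.
        print('Disk copy failed:', err.fault_list, err)
        raise

A call shaped like copy_cached_image(session, dc_ref, '[datastore1] vmware_temp/.../tmp-sparse.vmdk', '[datastore1] devstack-image-cache_base/...') corresponds to the source and destination paths in the copy record above.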
[ 832.147418] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3ace68e9-c0c1-4eda-aad5-7333a31be10d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 832.149583] env[60722]: DEBUG nova.compute.claims [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 832.149746] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 832.149950] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 832.171075] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 832.219012] env[60722]: DEBUG oslo_vmware.rw_handles [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/11df168d-49b9-4ee1-941e-d7995ea7e75c/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 832.275416] env[60722]: DEBUG oslo_vmware.rw_handles [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Completed reading data from the image iterator. {{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 832.275596] env[60722]: DEBUG oslo_vmware.rw_handles [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/11df168d-49b9-4ee1-941e-d7995ea7e75c/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 832.476642] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91e3c7b7-b9c0-44df-8e1a-4e216fdfa74e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 832.485349] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8192368f-8740-46ad-9e16-66bc196c8ca4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 832.515111] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed381d9e-fd95-4358-b36a-603aa6814de1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 832.522158] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0182e1d-6070-4c43-b96f-efce6769a97f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 832.536332] env[60722]: DEBUG nova.compute.provider_tree [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 832.544961] env[60722]: DEBUG nova.scheduler.client.report [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 832.559153] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.409s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 832.559312] env[60722]: ERROR nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 832.559312] env[60722]: Faults: ['InvalidArgument'] [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Traceback (most recent call last): [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 
1b18a8e4-eab9-4f28-bd87-a354c436b51c] self.driver.spawn(context, instance, image_meta, [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] self._fetch_image_if_missing(context, vi) [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] image_cache(vi, tmp_image_ds_loc) [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] vm_util.copy_virtual_disk( [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] session._wait_for_task(vmdk_copy_task) [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] return self.wait_for_task(task_ref) [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] return evt.wait() [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] result = hub.switch() [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] return self.greenlet.switch() [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] self.f(*self.args, **self.kw) [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 832.559312] env[60722]: ERROR 
nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] raise exceptions.translate_fault(task_info.error) [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Faults: ['InvalidArgument'] [ 832.559312] env[60722]: ERROR nova.compute.manager [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] [ 832.560264] env[60722]: DEBUG nova.compute.utils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] VimFaultException {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 832.562066] env[60722]: DEBUG nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Build of instance 1b18a8e4-eab9-4f28-bd87-a354c436b51c was re-scheduled: A specified parameter was not correct: fileType [ 832.562066] env[60722]: Faults: ['InvalidArgument'] {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 832.562194] env[60722]: DEBUG nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 832.562306] env[60722]: DEBUG nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 832.562475] env[60722]: DEBUG nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 832.562640] env[60722]: DEBUG nova.network.neutron [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 832.856031] env[60722]: DEBUG nova.network.neutron [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 832.871072] env[60722]: INFO nova.compute.manager [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 1b18a8e4-eab9-4f28-bd87-a354c436b51c] Took 0.31 seconds to deallocate network for instance. [ 832.967048] env[60722]: INFO nova.scheduler.client.report [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Deleted allocations for instance 1b18a8e4-eab9-4f28-bd87-a354c436b51c [ 832.981138] env[60722]: DEBUG oslo_concurrency.lockutils [None req-88a64a10-dd6a-4dab-be8e-60268ba56276 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "1b18a8e4-eab9-4f28-bd87-a354c436b51c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 240.803s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 833.001023] env[60722]: DEBUG nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Starting instance... 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 833.047152] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 833.047397] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 833.048764] env[60722]: INFO nova.compute.claims [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 833.288983] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69643144-027a-4239-bd07-2ce784bf315a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 833.296845] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8090f6ee-472e-4189-a538-854261b1f356 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 833.326248] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce2dedea-8047-44d6-af7b-9990dd9f267c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 833.332904] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44a67a79-4f35-4bc1-93b7-d588e3930da8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 833.346069] env[60722]: DEBUG nova.compute.provider_tree [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 833.354487] env[60722]: DEBUG nova.scheduler.client.report [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 833.369111] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 
tempest-ServerGroupTestJSON-446669058-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 833.378897] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Acquiring lock "661e9bad-0330-4499-8af6-f9d181e382ce" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 833.379127] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Lock "661e9bad-0330-4499-8af6-f9d181e382ce" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 833.383868] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Lock "661e9bad-0330-4499-8af6-f9d181e382ce" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: held 0.005s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 833.384314] env[60722]: DEBUG nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 833.414132] env[60722]: DEBUG nova.compute.utils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 833.415283] env[60722]: DEBUG nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 833.415451] env[60722]: DEBUG nova.network.neutron [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 833.423268] env[60722]: DEBUG nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Start building block device mappings for instance. 
{{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 833.468524] env[60722]: DEBUG nova.policy [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e8ed004bafd34b05aa27c555cd65b781', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8496b203b3ed4853a7352050c92a7638', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 833.485314] env[60722]: DEBUG nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Start spawning the instance on the hypervisor. {{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 833.506723] env[60722]: DEBUG nova.virt.hardware [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 833.506953] env[60722]: DEBUG nova.virt.hardware [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 833.507120] env[60722]: DEBUG nova.virt.hardware [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 833.507298] env[60722]: DEBUG nova.virt.hardware [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 833.507439] env[60722]: DEBUG nova.virt.hardware [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 833.507579] env[60722]: DEBUG nova.virt.hardware [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 
tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 833.507807] env[60722]: DEBUG nova.virt.hardware [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 833.507992] env[60722]: DEBUG nova.virt.hardware [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 833.508256] env[60722]: DEBUG nova.virt.hardware [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 833.508516] env[60722]: DEBUG nova.virt.hardware [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 833.508585] env[60722]: DEBUG nova.virt.hardware [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 833.509443] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b9aeca8-35eb-4f5b-ae14-0d4edb55dd1f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 833.517417] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f3fa4cb-5528-4137-bca5-958b9f4fe52e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 833.863379] env[60722]: DEBUG nova.network.neutron [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Successfully created port: 2cda1401-efd0-4fa9-91d2-b2bfa41396c7 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 833.944149] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 833.944320] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Cleaning up deleted instances {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11095}} [ 833.965562] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] There are 0 instances to clean 
{{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11104}} [ 833.965781] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 833.965917] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Cleaning up deleted instances with incomplete migration {{(pid=60722) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11133}} [ 833.977917] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 834.523130] env[60722]: DEBUG nova.network.neutron [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Successfully updated port: 2cda1401-efd0-4fa9-91d2-b2bfa41396c7 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 834.536203] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Acquiring lock "refresh_cache-2786801d-6211-4598-b357-4f0a0ffdd7d1" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 834.536203] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Acquired lock "refresh_cache-2786801d-6211-4598-b357-4f0a0ffdd7d1" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 834.536203] env[60722]: DEBUG nova.network.neutron [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 834.565532] env[60722]: DEBUG nova.network.neutron [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Instance cache missing network info. 
{{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 834.918275] env[60722]: DEBUG nova.network.neutron [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Updating instance_info_cache with network_info: [{"id": "2cda1401-efd0-4fa9-91d2-b2bfa41396c7", "address": "fa:16:3e:87:92:61", "network": {"id": "f6cc3b76-d482-446e-a05b-b72596952a38", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-254634811-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8496b203b3ed4853a7352050c92a7638", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "05b1253d-2b87-4158-9ff1-dafcf829f11f", "external-id": "nsx-vlan-transportzone-55", "segmentation_id": 55, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2cda1401-ef", "ovs_interfaceid": "2cda1401-efd0-4fa9-91d2-b2bfa41396c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 834.929198] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Releasing lock "refresh_cache-2786801d-6211-4598-b357-4f0a0ffdd7d1" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 834.929519] env[60722]: DEBUG nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Instance network_info: |[{"id": "2cda1401-efd0-4fa9-91d2-b2bfa41396c7", "address": "fa:16:3e:87:92:61", "network": {"id": "f6cc3b76-d482-446e-a05b-b72596952a38", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-254634811-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8496b203b3ed4853a7352050c92a7638", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "05b1253d-2b87-4158-9ff1-dafcf829f11f", "external-id": "nsx-vlan-transportzone-55", "segmentation_id": 55, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2cda1401-ef", "ovs_interfaceid": "2cda1401-efd0-4fa9-91d2-b2bfa41396c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 834.929873] env[60722]: DEBUG nova.virt.vmwareapi.vmops 
[None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:87:92:61', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '05b1253d-2b87-4158-9ff1-dafcf829f11f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2cda1401-efd0-4fa9-91d2-b2bfa41396c7', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 834.937470] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Creating folder: Project (8496b203b3ed4853a7352050c92a7638). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 834.937948] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-74786cf4-14b1-48e5-9d64-e58ff9806ebe {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.949269] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Created folder: Project (8496b203b3ed4853a7352050c92a7638) in parent group-v141606. [ 834.951531] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Creating folder: Instances. Parent ref: group-v141647. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 834.951531] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-489782d8-5a94-4acc-9d80-5a03ec9042b8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.960308] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Created folder: Instances in parent group-v141647. [ 834.961035] env[60722]: DEBUG oslo.service.loopingcall [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 834.961035] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 834.961035] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9ce65c01-064f-4c60-8d5b-69c73d7a8084 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 834.977221] env[60722]: DEBUG nova.compute.manager [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Received event network-vif-plugged-2cda1401-efd0-4fa9-91d2-b2bfa41396c7 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 834.977403] env[60722]: DEBUG oslo_concurrency.lockutils [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] Acquiring lock "2786801d-6211-4598-b357-4f0a0ffdd7d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 834.977987] env[60722]: DEBUG oslo_concurrency.lockutils [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] Lock "2786801d-6211-4598-b357-4f0a0ffdd7d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 834.977987] env[60722]: DEBUG oslo_concurrency.lockutils [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] Lock "2786801d-6211-4598-b357-4f0a0ffdd7d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 834.977987] env[60722]: DEBUG nova.compute.manager [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] No waiting events found dispatching network-vif-plugged-2cda1401-efd0-4fa9-91d2-b2bfa41396c7 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 834.978174] env[60722]: WARNING nova.compute.manager [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Received unexpected event network-vif-plugged-2cda1401-efd0-4fa9-91d2-b2bfa41396c7 for instance with vm_state building and task_state spawning. [ 834.978280] env[60722]: DEBUG nova.compute.manager [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Received event network-changed-2cda1401-efd0-4fa9-91d2-b2bfa41396c7 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 834.978483] env[60722]: DEBUG nova.compute.manager [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Refreshing instance network info cache due to event network-changed-2cda1401-efd0-4fa9-91d2-b2bfa41396c7. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 834.978593] env[60722]: DEBUG oslo_concurrency.lockutils [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] Acquiring lock "refresh_cache-2786801d-6211-4598-b357-4f0a0ffdd7d1" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 834.978718] env[60722]: DEBUG oslo_concurrency.lockutils [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] Acquired lock "refresh_cache-2786801d-6211-4598-b357-4f0a0ffdd7d1" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 834.978863] env[60722]: DEBUG nova.network.neutron [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Refreshing network info cache for port 2cda1401-efd0-4fa9-91d2-b2bfa41396c7 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 834.981067] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 834.981067] env[60722]: value = "task-565178" [ 834.981067] env[60722]: _type = "Task" [ 834.981067] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 834.991021] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565178, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 835.332642] env[60722]: DEBUG nova.network.neutron [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Updated VIF entry in instance network info cache for port 2cda1401-efd0-4fa9-91d2-b2bfa41396c7. 
{{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 835.333030] env[60722]: DEBUG nova.network.neutron [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Updating instance_info_cache with network_info: [{"id": "2cda1401-efd0-4fa9-91d2-b2bfa41396c7", "address": "fa:16:3e:87:92:61", "network": {"id": "f6cc3b76-d482-446e-a05b-b72596952a38", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-254634811-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8496b203b3ed4853a7352050c92a7638", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "05b1253d-2b87-4158-9ff1-dafcf829f11f", "external-id": "nsx-vlan-transportzone-55", "segmentation_id": 55, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2cda1401-ef", "ovs_interfaceid": "2cda1401-efd0-4fa9-91d2-b2bfa41396c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 835.342284] env[60722]: DEBUG oslo_concurrency.lockutils [req-cf2af709-21ae-4135-b0a5-e987ca018b64 req-8315079f-bbbc-4ac1-9d86-021d5f3f021d service nova] Releasing lock "refresh_cache-2786801d-6211-4598-b357-4f0a0ffdd7d1" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 835.491292] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565178, 'name': CreateVM_Task, 'duration_secs': 0.287203} completed successfully. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 835.491292] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 835.491867] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 835.492036] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 835.492353] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 835.492572] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0e8dd2d8-2fc5-41b9-9669-0c32c735acf9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 835.496893] env[60722]: DEBUG oslo_vmware.api [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Waiting for the task: (returnval){ [ 835.496893] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52cbcf35-f55e-b08a-c005-58b59eeb83b5" [ 835.496893] env[60722]: _type = "Task" [ 835.496893] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 835.504025] env[60722]: DEBUG oslo_vmware.api [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52cbcf35-f55e-b08a-c005-58b59eeb83b5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 835.981062] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 835.981321] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 835.981480] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 835.981623] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60722) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 836.008125] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 836.008356] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 836.008559] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 836.944534] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 837.944487] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 837.944839] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 838.945360] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 
838.945667] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Starting heal instance info cache {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 838.945667] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Rebuilding the list of instances to heal {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 838.964466] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 838.964613] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 838.964827] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 838.964974] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 838.965113] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 838.965233] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 838.965351] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 838.965468] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 838.965583] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 838.965696] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Skipping network cache update for instance because it is Building. 
{{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 838.965812] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Didn't find any instances for network info cache update. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 838.966250] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 839.943774] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 839.953541] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 839.953838] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 839.954120] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 839.954120] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 839.955119] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b005cc56-0b84-4292-b146-b192fe68ce37 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 839.963895] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51c445dd-df1e-4af7-a9f0-8b8c4476ff7c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 839.977249] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b629cdf-d27c-4964-bf39-dd6525089a45 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 839.983270] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-850dd78c-53ab-4070-af58-e6745e81105e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 840.012573] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181730MB free_disk=100GB free_vcpus=48 
pci_devices=None {{(pid=60722) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 840.012723] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 840.012952] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 840.128991] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 65901b4a-42cf-4795-abc2-b0fea1f4fee7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 840.129206] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance bfde3558-9940-4402-bdf9-15c23c285a8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 840.129343] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance bc2a1e45-2f48-4a73-bfee-69a20725a610 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 840.129468] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance e93b8d4b-6286-410a-870a-02fa7e59d90d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 840.129584] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 93268011-e1f2-4041-b4df-473c06d3f1eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 840.129699] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance eae8d9ce-9fe3-411e-9fd8-05920fb0af04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 840.129813] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 4e66f1dc-18c6-4d64-9bbe-9b061e795a65 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 840.129927] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 22463917-2185-42f7-87b7-2b720be45c22 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 840.130048] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 1c4b8597-88ec-4e79-a749-f802803a5ffe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 840.130163] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 2786801d-6211-4598-b357-4f0a0ffdd7d1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 840.141280] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 019db29d-b8e4-4592-b7c4-2c044e2b2a51 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 840.151689] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 151df220-ca11-4455-b620-f8fe5a1be5b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 840.161334] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 47f34953-a5e3-4b5e-9164-ec5980802298 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 840.170893] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 7f1d5c92-ea40-4ad1-b669-877028b69711 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 840.180190] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance ecade932-ccf5-4a5c-8348-6d88a311f3a1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 840.189776] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance f60a4fc6-1a93-4f54-8b34-71aa0a2e035c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 840.203515] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 22eaddb2-108d-44e4-8d9d-85fc91efa9f9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 840.213014] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 825f4673-e46f-4b32-a3d8-4bb163ac2390 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 840.222372] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance b9025e22-8080-4887-8e4e-179866f704ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 840.231856] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 020c2b79-e755-4178-aa85-5ecaa31e7a9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 840.232095] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 840.232244] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1856MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 840.247976] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Refreshing inventories for resource provider 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 840.262176] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Updating ProviderTree inventory for provider 6d7f336b-9351-4171-8197-866cdafbab42 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 840.262361] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Updating inventory in ProviderTree for provider 6d7f336b-9351-4171-8197-866cdafbab42 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 840.273598] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Refreshing aggregate associations for resource provider 6d7f336b-9351-4171-8197-866cdafbab42, aggregates: None {{(pid=60722) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 840.289545] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Refreshing trait associations for resource provider 6d7f336b-9351-4171-8197-866cdafbab42, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=60722) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 840.519015] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95495cc4-cd25-46c4-b94a-67bd4fddf4b6 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 840.527100] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-a1a3934f-d76e-40be-bb9a-80391ce476f7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 840.556783] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21ca47b4-4044-4db0-8f4a-8a74eff491a4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 840.564102] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba8ed760-4cac-4952-8573-95f14f990422 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 840.576857] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 840.585327] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 840.603697] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 840.603908] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.591s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 864.988774] env[60722]: DEBUG nova.compute.manager [req-6e53d252-83b4-4e12-b1a5-1688927f1a61 req-24d735c1-e5f7-4c57-a9a1-2900c85462a8 service nova] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Received event network-vif-deleted-5c9fb706-c952-40fe-9e82-4d1ababaeea1 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 875.894554] env[60722]: DEBUG nova.compute.manager [req-ebe86174-898b-4a6a-8cd9-b2c3adc25a0f req-09adebc4-c2e7-4b91-b262-c1798b35f06d service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Received event network-vif-deleted-27afab40-1b84-4089-973a-32c8164be535 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 878.104163] env[60722]: DEBUG nova.compute.manager [req-bbd61242-baaa-42e9-b329-d091a454027b req-fb6d0bd6-3232-4d4d-8f88-1e3ebc1fad25 service nova] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Received event network-vif-deleted-b47f7eca-cc89-45db-bb41-ae0dcb7e8f3d {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 878.179667] env[60722]: WARNING oslo_vmware.rw_handles [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] 
Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 878.179667] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 878.179667] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 878.179667] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 878.179667] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 878.179667] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 878.179667] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 878.179667] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 878.179667] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 878.179667] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 878.179667] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 878.179667] env[60722]: ERROR oslo_vmware.rw_handles [ 878.180402] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/11df168d-49b9-4ee1-941e-d7995ea7e75c/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 878.181694] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Caching image {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 878.181926] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Copying Virtual Disk [datastore1] vmware_temp/11df168d-49b9-4ee1-941e-d7995ea7e75c/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/11df168d-49b9-4ee1-941e-d7995ea7e75c/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 878.182971] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-980cbd5c-027a-414c-920e-bac1355d6e09 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.193335] env[60722]: DEBUG oslo_vmware.api [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Waiting for the task: (returnval){ [ 878.193335] env[60722]: value = "task-565179" [ 878.193335] env[60722]: _type = "Task" [ 878.193335] env[60722]: } to complete. 
{{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 878.204426] env[60722]: DEBUG oslo_vmware.api [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Task: {'id': task-565179, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 878.614212] env[60722]: DEBUG nova.compute.manager [req-57f73c88-1c6c-4de9-864b-4d40c216929d req-60de87d9-1881-4d40-88ea-6d07ed0c71af service nova] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Received event network-vif-deleted-3493a62c-0ddc-4402-9192-74f0d77f00d2 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}}
[ 878.710677] env[60722]: DEBUG oslo_vmware.exceptions [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Fault InvalidArgument not matched. {{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 878.710963] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 878.711530] env[60722]: ERROR nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 878.711530] env[60722]: Faults: ['InvalidArgument']
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Traceback (most recent call last):
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] yield resources
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] self.driver.spawn(context, instance, image_meta,
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] self._fetch_image_if_missing(context, vi)
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] image_cache(vi, tmp_image_ds_loc)
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] vm_util.copy_virtual_disk(
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] session._wait_for_task(vmdk_copy_task)
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] return self.wait_for_task(task_ref)
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] return evt.wait()
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] result = hub.switch()
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] return self.greenlet.switch()
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] self.f(*self.args, **self.kw)
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] raise exceptions.translate_fault(task_info.error)
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Faults: ['InvalidArgument']
[ 878.711530] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f]
[ 878.712483] env[60722]: INFO nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Terminating instance
[ 878.713541] env[60722]: DEBUG
oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 878.713658] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 878.714327] env[60722]: DEBUG nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 878.714462] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 878.714677] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0a454a91-75ce-427d-9327-bcbbfec1dbc7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.717785] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-141fc22e-6a92-43ea-a945-a4a230272bb4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.725106] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 878.725106] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fdc43a3d-7bde-4e1a-8340-bfd0263b9d58 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.726911] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 878.727348] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 878.728086] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8c55f1ab-c93c-44cb-90f9-e4256e9ccf10 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.733775] env[60722]: DEBUG oslo_vmware.api [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Waiting for the task: (returnval){ [ 878.733775] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52514148-464c-85c2-166b-001d0a91cbc1" [ 878.733775] env[60722]: _type = "Task" [ 878.733775] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 878.744115] env[60722]: DEBUG oslo_vmware.api [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52514148-464c-85c2-166b-001d0a91cbc1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 878.798178] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 878.798398] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 878.798764] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Deleting the datastore file [datastore1] bfde3558-9940-4402-bdf9-15c23c285a8f {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 878.798848] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e8e3905e-2c2c-4b71-af1c-a9d5d4a54815 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.806363] env[60722]: DEBUG oslo_vmware.api [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Waiting for the task: (returnval){ [ 878.806363] env[60722]: value = "task-565181" [ 878.806363] env[60722]: _type = "Task" [ 878.806363] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 878.813722] env[60722]: DEBUG oslo_vmware.api [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Task: {'id': task-565181, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 879.244850] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 879.245451] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Creating directory with path [datastore1] vmware_temp/d7cc4497-d4e4-47c4-b940-77fda30aecb5/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 879.245713] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5dbf164d-6feb-44d1-a362-6eaf236fde70 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.256925] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Created directory with path [datastore1] vmware_temp/d7cc4497-d4e4-47c4-b940-77fda30aecb5/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 879.257161] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Fetch image to [datastore1] vmware_temp/d7cc4497-d4e4-47c4-b940-77fda30aecb5/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 879.257304] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/d7cc4497-d4e4-47c4-b940-77fda30aecb5/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 879.258034] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb8ca1ec-665a-4c4d-8044-debe1c751d66 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.265052] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5608e57-3e9c-495b-b70f-7272e730c882 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.274220] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d1930f1-0486-4493-8c8b-eedfbf90c20e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.306057] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7eccdf6-0ee5-4af4-9216-ef2e5ac6068a {{(pid=60722) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.318559] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-233ddc90-075d-4e3e-9107-04547fbbb8db {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 879.320519] env[60722]: DEBUG oslo_vmware.api [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Task: {'id': task-565181, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068796} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 879.320853] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 879.320939] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 879.321103] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 879.321450] env[60722]: INFO nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 879.325563] env[60722]: DEBUG nova.compute.claims [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 879.325563] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 879.325563] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 879.343786] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 879.540259] env[60722]: DEBUG oslo_vmware.rw_handles [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d7cc4497-d4e4-47c4-b940-77fda30aecb5/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 879.604955] env[60722]: DEBUG oslo_vmware.rw_handles [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Completed reading data from the image iterator. {{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 879.605149] env[60722]: DEBUG oslo_vmware.rw_handles [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d7cc4497-d4e4-47c4-b940-77fda30aecb5/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 879.694916] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09cab938-8831-4c56-aaca-c39f8fb05e2a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 879.704980] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f124856b-ccf1-4f7e-a8d4-65ed721cb962 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 879.736742] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f82b5f8-d8af-48c6-80e7-8f93c2671d13 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 879.745010] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c02d54e-691c-42a8-b4a1-2704e0d8b058 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 879.758900] env[60722]: DEBUG nova.compute.provider_tree [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 879.769216] env[60722]: DEBUG nova.scheduler.client.report [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 879.788252] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.463s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 879.788615] env[60722]: ERROR nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 879.788615] env[60722]: Faults: ['InvalidArgument']
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Traceback (most recent call last):
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] self.driver.spawn(context, instance, image_meta,
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] self._fetch_image_if_missing(context, vi)
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] image_cache(vi, tmp_image_ds_loc)
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] vm_util.copy_virtual_disk(
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] session._wait_for_task(vmdk_copy_task)
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] return self.wait_for_task(task_ref)
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] return evt.wait()
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] result = hub.switch()
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] return self.greenlet.switch()
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] self.f(*self.args, **self.kw)
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] raise exceptions.translate_fault(task_info.error)
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Faults: ['InvalidArgument']
[ 879.788615] env[60722]: ERROR nova.compute.manager [instance: bfde3558-9940-4402-bdf9-15c23c285a8f]
[ 879.790201] env[60722]: DEBUG nova.compute.utils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] VimFaultException {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 879.791823] env[60722]: DEBUG nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Build of instance bfde3558-9940-4402-bdf9-15c23c285a8f was re-scheduled: A specified parameter was not correct: fileType
[ 879.791823] env[60722]: Faults: ['InvalidArgument'] {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 879.792263] env[60722]: DEBUG nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 879.792476] env[60722]: DEBUG nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 879.792661] env[60722]: DEBUG nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 879.792868] env[60722]: DEBUG nova.network.neutron [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 880.222978] env[60722]: DEBUG nova.compute.manager [req-d1a635a8-88e4-4bee-bc67-9d5470af8243 req-4bd9abf3-4c16-4bcd-913d-63fa9a167136 service nova] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Received event network-vif-deleted-5c5a23cd-e91f-4ba3-8aac-d344adde8784 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 880.256113] env[60722]: DEBUG nova.network.neutron [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 880.275038] env[60722]: INFO nova.compute.manager [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: bfde3558-9940-4402-bdf9-15c23c285a8f] Took 0.48 seconds to deallocate network for instance. [ 880.419019] env[60722]: INFO nova.scheduler.client.report [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Deleted allocations for instance bfde3558-9940-4402-bdf9-15c23c285a8f [ 880.441476] env[60722]: DEBUG oslo_concurrency.lockutils [None req-0749cabf-9b7f-42d5-9e4c-2618dabe2e24 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "bfde3558-9940-4402-bdf9-15c23c285a8f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 286.503s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 880.467124] env[60722]: DEBUG nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Starting instance... 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 880.542487] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 880.542726] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 880.544524] env[60722]: INFO nova.compute.claims [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 880.890133] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cd0e844-31c7-4d0b-b702-3d079a9a143d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 880.902339] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ab6235f-aeb1-4290-8e45-2b38a3328008 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 880.942171] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06268b2c-6a7f-4b47-b361-0e4b00a48659 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 880.950032] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e506003-7829-4681-9a06-416ec194aaff {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 880.970216] env[60722]: DEBUG nova.compute.provider_tree [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 880.987590] env[60722]: DEBUG nova.scheduler.client.report [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 881.010107] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 
tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.467s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 881.010684] env[60722]: DEBUG nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 881.078406] env[60722]: DEBUG nova.compute.utils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 881.079999] env[60722]: DEBUG nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 881.080200] env[60722]: DEBUG nova.network.neutron [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 881.099034] env[60722]: DEBUG nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 881.172639] env[60722]: DEBUG nova.policy [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '856ad1b1a530413880e8814555fa2bf7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ba96dde0c24d4265aa973b0c8f7573eb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 881.181785] env[60722]: DEBUG nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 881.214234] env[60722]: DEBUG nova.virt.hardware [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 881.214234] env[60722]: DEBUG nova.virt.hardware [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 881.215175] env[60722]: DEBUG nova.virt.hardware [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 881.215175] env[60722]: DEBUG nova.virt.hardware [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 881.215175] env[60722]: DEBUG nova.virt.hardware [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 881.215175] env[60722]: DEBUG nova.virt.hardware [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 881.215175] env[60722]: DEBUG nova.virt.hardware [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 881.215425] env[60722]: DEBUG nova.virt.hardware [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
881.215606] env[60722]: DEBUG nova.virt.hardware [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 881.216753] env[60722]: DEBUG nova.virt.hardware [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 881.216753] env[60722]: DEBUG nova.virt.hardware [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 881.216753] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a945e77-befe-4a6d-ba5e-b201784c06f7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 881.225700] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1027e77d-ed2d-4bf4-ace4-4ef077615242 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 882.047565] env[60722]: DEBUG nova.network.neutron [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Successfully created port: 5b4ad86f-0472-4191-873d-4caa9cd528de {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 882.990630] env[60722]: DEBUG nova.network.neutron [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Successfully updated port: 5b4ad86f-0472-4191-873d-4caa9cd528de {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 883.000927] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Acquiring lock "refresh_cache-019db29d-b8e4-4592-b7c4-2c044e2b2a51" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 883.001358] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Acquired lock "refresh_cache-019db29d-b8e4-4592-b7c4-2c044e2b2a51" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 883.001585] env[60722]: DEBUG nova.network.neutron [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 883.085364] env[60722]: DEBUG nova.network.neutron [None 
req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 883.444481] env[60722]: DEBUG nova.network.neutron [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Updating instance_info_cache with network_info: [{"id": "5b4ad86f-0472-4191-873d-4caa9cd528de", "address": "fa:16:3e:b0:8e:28", "network": {"id": "5296096a-cafd-4fc9-926d-91a5234636e2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-284984084-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ba96dde0c24d4265aa973b0c8f7573eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35e463c7-7d78-4d66-8efd-6127b1f3ee17", "external-id": "nsx-vlan-transportzone-175", "segmentation_id": 175, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5b4ad86f-04", "ovs_interfaceid": "5b4ad86f-0472-4191-873d-4caa9cd528de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 883.466267] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Releasing lock "refresh_cache-019db29d-b8e4-4592-b7c4-2c044e2b2a51" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 883.467406] env[60722]: DEBUG nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Instance network_info: |[{"id": "5b4ad86f-0472-4191-873d-4caa9cd528de", "address": "fa:16:3e:b0:8e:28", "network": {"id": "5296096a-cafd-4fc9-926d-91a5234636e2", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-284984084-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ba96dde0c24d4265aa973b0c8f7573eb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "35e463c7-7d78-4d66-8efd-6127b1f3ee17", "external-id": "nsx-vlan-transportzone-175", "segmentation_id": 175, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5b4ad86f-04", "ovs_interfaceid": "5b4ad86f-0472-4191-873d-4caa9cd528de", "qbh_params": 
null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 883.469019] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b0:8e:28', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '35e463c7-7d78-4d66-8efd-6127b1f3ee17', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5b4ad86f-0472-4191-873d-4caa9cd528de', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 883.479132] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Creating folder: Project (ba96dde0c24d4265aa973b0c8f7573eb). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 883.480303] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c5ed144e-8a6f-456f-aa1a-909e707565dd {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 883.492222] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Created folder: Project (ba96dde0c24d4265aa973b0c8f7573eb) in parent group-v141606. [ 883.492396] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Creating folder: Instances. Parent ref: group-v141650. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 883.492971] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-32fff9d7-5d49-45e7-aba6-20a2ad766b71 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 883.502564] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Created folder: Instances in parent group-v141650. [ 883.502564] env[60722]: DEBUG oslo.service.loopingcall [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 883.502702] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 883.503268] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3756f214-8b82-4c32-9093-d87422846b62 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 883.523363] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 883.523363] env[60722]: value = "task-565184" [ 883.523363] env[60722]: _type = "Task" [ 883.523363] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 883.531625] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565184, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 883.658689] env[60722]: DEBUG nova.compute.manager [req-9c510de4-6269-4b1a-b95f-045f47406bca req-24523ce2-7186-473f-95b9-d7a1a3321c90 service nova] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Received event network-vif-plugged-5b4ad86f-0472-4191-873d-4caa9cd528de {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 883.658959] env[60722]: DEBUG oslo_concurrency.lockutils [req-9c510de4-6269-4b1a-b95f-045f47406bca req-24523ce2-7186-473f-95b9-d7a1a3321c90 service nova] Acquiring lock "019db29d-b8e4-4592-b7c4-2c044e2b2a51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 883.659240] env[60722]: DEBUG oslo_concurrency.lockutils [req-9c510de4-6269-4b1a-b95f-045f47406bca req-24523ce2-7186-473f-95b9-d7a1a3321c90 service nova] Lock "019db29d-b8e4-4592-b7c4-2c044e2b2a51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 883.659539] env[60722]: DEBUG oslo_concurrency.lockutils [req-9c510de4-6269-4b1a-b95f-045f47406bca req-24523ce2-7186-473f-95b9-d7a1a3321c90 service nova] Lock "019db29d-b8e4-4592-b7c4-2c044e2b2a51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 883.659784] env[60722]: DEBUG nova.compute.manager [req-9c510de4-6269-4b1a-b95f-045f47406bca req-24523ce2-7186-473f-95b9-d7a1a3321c90 service nova] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] No waiting events found dispatching network-vif-plugged-5b4ad86f-0472-4191-873d-4caa9cd528de {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 883.659964] env[60722]: WARNING nova.compute.manager [req-9c510de4-6269-4b1a-b95f-045f47406bca req-24523ce2-7186-473f-95b9-d7a1a3321c90 service nova] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Received unexpected event network-vif-plugged-5b4ad86f-0472-4191-873d-4caa9cd528de for instance with vm_state deleted and task_state None. 
[ 883.810882] env[60722]: DEBUG nova.compute.manager [req-147bdd87-7cde-46d9-ba85-f90a84a5418f req-315621a5-2576-42fe-bf0f-92ffe363c8ca service nova] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Received event network-vif-deleted-5b4ad86f-0472-4191-873d-4caa9cd528de {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 884.034932] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565184, 'name': CreateVM_Task, 'duration_secs': 0.289679} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 884.035320] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 884.035846] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 884.036191] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 884.036352] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 884.036599] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4142de61-c058-4d54-8511-b12b425ac3d5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 884.043810] env[60722]: DEBUG oslo_vmware.api [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Waiting for the task: (returnval){ [ 884.043810] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52d92fc4-f67a-a8e8-1fae-bfaaea4d5d37" [ 884.043810] env[60722]: _type = "Task" [ 884.043810] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 884.055154] env[60722]: DEBUG oslo_vmware.api [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52d92fc4-f67a-a8e8-1fae-bfaaea4d5d37, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 884.555154] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 884.555620] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 884.555837] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 885.986296] env[60722]: DEBUG nova.compute.manager [req-d1570d4f-8ced-4931-a653-1d6c2e358996 req-d57dc074-8d9c-47e5-a199-8118715fe6e7 service nova] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Received event network-changed-5b4ad86f-0472-4191-873d-4caa9cd528de {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 885.986818] env[60722]: DEBUG nova.compute.manager [req-d1570d4f-8ced-4931-a653-1d6c2e358996 req-d57dc074-8d9c-47e5-a199-8118715fe6e7 service nova] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Refreshing instance network info cache due to event network-changed-5b4ad86f-0472-4191-873d-4caa9cd528de. {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 885.986818] env[60722]: DEBUG oslo_concurrency.lockutils [req-d1570d4f-8ced-4931-a653-1d6c2e358996 req-d57dc074-8d9c-47e5-a199-8118715fe6e7 service nova] Acquiring lock "refresh_cache-019db29d-b8e4-4592-b7c4-2c044e2b2a51" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 885.986818] env[60722]: DEBUG oslo_concurrency.lockutils [req-d1570d4f-8ced-4931-a653-1d6c2e358996 req-d57dc074-8d9c-47e5-a199-8118715fe6e7 service nova] Acquired lock "refresh_cache-019db29d-b8e4-4592-b7c4-2c044e2b2a51" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 885.987186] env[60722]: DEBUG nova.network.neutron [req-d1570d4f-8ced-4931-a653-1d6c2e358996 req-d57dc074-8d9c-47e5-a199-8118715fe6e7 service nova] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Refreshing network info cache for port 5b4ad86f-0472-4191-873d-4caa9cd528de {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 886.021185] env[60722]: DEBUG nova.network.neutron [req-d1570d4f-8ced-4931-a653-1d6c2e358996 req-d57dc074-8d9c-47e5-a199-8118715fe6e7 service nova] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Instance cache missing network info. 
{{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 886.554585] env[60722]: DEBUG nova.network.neutron [req-d1570d4f-8ced-4931-a653-1d6c2e358996 req-d57dc074-8d9c-47e5-a199-8118715fe6e7 service nova] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Instance is deleted, no further info cache update {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:106}} [ 886.554912] env[60722]: DEBUG oslo_concurrency.lockutils [req-d1570d4f-8ced-4931-a653-1d6c2e358996 req-d57dc074-8d9c-47e5-a199-8118715fe6e7 service nova] Releasing lock "refresh_cache-019db29d-b8e4-4592-b7c4-2c044e2b2a51" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 886.555234] env[60722]: DEBUG nova.compute.manager [req-d1570d4f-8ced-4931-a653-1d6c2e358996 req-d57dc074-8d9c-47e5-a199-8118715fe6e7 service nova] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Received event network-vif-deleted-2cda1401-efd0-4fa9-91d2-b2bfa41396c7 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 896.605946] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 896.945842] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 896.945842] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 896.945842] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60722) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 897.943052] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 898.944577] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 898.944966] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 899.945542] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 900.940541] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 900.956374] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 900.956668] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Starting heal instance info cache {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 900.956710] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Rebuilding the list of instances to heal {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 900.970207] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 900.970362] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 900.970495] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 900.970621] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Skipping network cache update for instance because it is Building. 
{{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 900.970870] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Didn't find any instances for network info cache update. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 900.971313] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 900.981083] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 900.981293] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 900.981717] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 900.981900] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 900.982962] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6cf9106-fc5a-4d1e-baf0-7bdb449f2ca0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.991708] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ee2794e-1b90-4e1c-a03b-7c427ab1aa8b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.007019] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc4f6a71-ff59-497a-8b3c-ec3800165fdc {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.012313] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4bb9e86-2b41-45dc-9eaf-0cfc68e031db {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.044331] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181705MB free_disk=100GB free_vcpus=48 pci_devices=None {{(pid=60722) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 901.044494] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 901.044682] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 901.107590] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 65901b4a-42cf-4795-abc2-b0fea1f4fee7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 901.107739] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance bc2a1e45-2f48-4a73-bfee-69a20725a610 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 901.107863] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance e93b8d4b-6286-410a-870a-02fa7e59d90d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 901.107984] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 93268011-e1f2-4041-b4df-473c06d3f1eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 901.120378] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance b9025e22-8080-4887-8e4e-179866f704ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 901.130558] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 020c2b79-e755-4178-aa85-5ecaa31e7a9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 901.130786] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 901.130940] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1088MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 901.209560] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-000d8f78-730a-4e5c-a1c2-7fd31d7769e8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.217010] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cab0a65a-bd18-4d55-877e-54694bbe8092 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.248443] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-075ce9c7-dbef-4052-a7b6-36d0cd40f1f6 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.255924] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f94075e-de68-479d-8ef0-6a114dc862b7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 901.271132] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 901.281828] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 901.296978] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 901.297403] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.253s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 905.754469] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc 
tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Acquiring lock "8d78f310-a2f2-4073-8371-afc42cc566f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 905.754789] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Lock "8d78f310-a2f2-4073-8371-afc42cc566f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 928.197226] env[60722]: WARNING oslo_vmware.rw_handles [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 928.197226] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 928.197226] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 928.197226] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 928.197226] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 928.197226] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 928.197226] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 928.197226] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 928.197226] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 928.197226] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 928.197226] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 928.197226] env[60722]: ERROR oslo_vmware.rw_handles [ 928.197977] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/d7cc4497-d4e4-47c4-b940-77fda30aecb5/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 928.199270] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Caching image {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 928.199515] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Copying Virtual Disk [datastore1] 
vmware_temp/d7cc4497-d4e4-47c4-b940-77fda30aecb5/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/d7cc4497-d4e4-47c4-b940-77fda30aecb5/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 928.199781] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-31c4b302-e6e9-4f47-96b3-d8aaba61a615 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.207847] env[60722]: DEBUG oslo_vmware.api [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Waiting for the task: (returnval){ [ 928.207847] env[60722]: value = "task-565185" [ 928.207847] env[60722]: _type = "Task" [ 928.207847] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 928.215449] env[60722]: DEBUG oslo_vmware.api [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Task: {'id': task-565185, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 928.718136] env[60722]: DEBUG oslo_vmware.exceptions [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Fault InvalidArgument not matched. {{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 928.718361] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 928.718883] env[60722]: ERROR nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 928.718883] env[60722]: Faults: ['InvalidArgument'] [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Traceback (most recent call last): [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] yield resources [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] self.driver.spawn(context, instance, image_meta, [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 
529, in spawn [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] self._vmops.spawn(context, instance, image_meta, injected_files, [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] self._fetch_image_if_missing(context, vi) [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] image_cache(vi, tmp_image_ds_loc) [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] vm_util.copy_virtual_disk( [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] session._wait_for_task(vmdk_copy_task) [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] return self.wait_for_task(task_ref) [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] return evt.wait() [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] result = hub.switch() [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] return self.greenlet.switch() [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] self.f(*self.args, **self.kw) [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] raise exceptions.translate_fault(task_info.error) [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Faults: ['InvalidArgument'] [ 928.718883] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] [ 928.720130] env[60722]: INFO nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Terminating instance [ 928.720720] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 928.720987] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 928.721234] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-667edd81-68e7-4af5-b9d5-edc6235a3279 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.723600] env[60722]: DEBUG nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Start destroying the instance on the hypervisor. 
{{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 928.723794] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 928.724510] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1d280fa-558f-4bb9-9dd1-f6b7914a89da {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.731164] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 928.731337] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ce8f2c36-ff83-448d-ba4f-2b60277f155e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.733483] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 928.733645] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 928.734913] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-25a31a10-186f-49b8-a79a-61e546aa2d26 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.739400] env[60722]: DEBUG oslo_vmware.api [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Waiting for the task: (returnval){ [ 928.739400] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52294716-a387-897b-b4e0-57eaab5d2313" [ 928.739400] env[60722]: _type = "Task" [ 928.739400] env[60722]: } to complete. 
{{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 928.753016] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 928.753247] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Creating directory with path [datastore1] vmware_temp/baba356a-a489-4152-8095-c6d47cf04719/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 928.753442] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9a3482bb-26de-440b-b87e-474c0d4e1e68 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.773438] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Created directory with path [datastore1] vmware_temp/baba356a-a489-4152-8095-c6d47cf04719/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 928.773622] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Fetch image to [datastore1] vmware_temp/baba356a-a489-4152-8095-c6d47cf04719/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 928.773785] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/baba356a-a489-4152-8095-c6d47cf04719/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 928.774525] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67091a05-a5d8-4803-a7dc-4dd332be5773 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.780882] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed5057c3-ef43-4120-bb97-b39e4a7986cf {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.789735] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5d66695-3e63-49f6-baad-b727cdca8394 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.822016] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-899ab216-3627-43d5-9509-f4e3976605f5 {{(pid=60722) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.824639] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 928.824817] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 928.825012] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Deleting the datastore file [datastore1] bc2a1e45-2f48-4a73-bfee-69a20725a610 {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 928.825254] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ffe67f3a-4473-4fa3-8f79-c4e6156a38ba {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.829896] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-707b2e76-8eab-489c-87db-9d88eee9064c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 928.832771] env[60722]: DEBUG oslo_vmware.api [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Waiting for the task: (returnval){ [ 928.832771] env[60722]: value = "task-565187" [ 928.832771] env[60722]: _type = "Task" [ 928.832771] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 928.840181] env[60722]: DEBUG oslo_vmware.api [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Task: {'id': task-565187, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 928.851406] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 928.895540] env[60722]: DEBUG oslo_vmware.rw_handles [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/baba356a-a489-4152-8095-c6d47cf04719/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 928.951617] env[60722]: DEBUG oslo_vmware.rw_handles [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Completed reading data from the image iterator. {{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 928.951849] env[60722]: DEBUG oslo_vmware.rw_handles [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/baba356a-a489-4152-8095-c6d47cf04719/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 929.343479] env[60722]: DEBUG oslo_vmware.api [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Task: {'id': task-565187, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07319} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 929.343814] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 929.343868] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 929.344069] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 929.344241] env[60722]: INFO nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Took 0.62 seconds to destroy the instance on the hypervisor. 
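The write-handle records just above ("Creating HTTP connection to write to file ...", "Completed reading data from the image iterator", "Closing write handle") and the earlier rw_handles warning in this section describe streaming the image to the datastore over HTTP and only reading the server's reply when the handle is closed. A rough stand-in using only the standard library, not oslo.vmware's actual FileWriteHandle; host, path, and chunk source are placeholders.

```python
# Rough stand-in (standard library only) for the datastore write handle whose
# lifecycle is logged above: open a connection, stream the image bytes, and
# read the server's response only when closing. Host/path are placeholders.
import http.client


def upload_to_datastore(host, path, data_chunks, file_size):
    conn = http.client.HTTPSConnection(host, 443)
    conn.putrequest('PUT', path)
    conn.putheader('Content-Length', str(file_size))
    conn.endheaders()
    for chunk in data_chunks:          # "reading data from the image iterator"
        conn.send(chunk)
    try:
        # The response is read only at close time; if the ESX host drops the
        # connection early this raises http.client.RemoteDisconnected, which is
        # what the earlier oslo_vmware.rw_handles warning in this log shows.
        resp = conn.getresponse()
        resp.read()
    finally:
        conn.close()
```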
[ 929.346277] env[60722]: DEBUG nova.compute.claims [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 929.346439] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 929.346639] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 929.484694] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c966fc9-1e03-4c9c-9b7e-2d06ee631a6d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 929.491425] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4a9a97b-444f-4367-8858-0d0aa6456295 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 929.520941] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07f10b0c-e895-4bb9-b8b3-a3fed7b9776a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 929.527439] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ded0c24-6022-4b2d-b8a0-b9c0899c0028 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 929.540062] env[60722]: DEBUG nova.compute.provider_tree [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 929.548295] env[60722]: DEBUG nova.scheduler.client.report [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 929.560994] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 
tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.214s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 929.561499] env[60722]: ERROR nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 929.561499] env[60722]: Faults: ['InvalidArgument'] [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Traceback (most recent call last): [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] self.driver.spawn(context, instance, image_meta, [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] self._vmops.spawn(context, instance, image_meta, injected_files, [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] self._fetch_image_if_missing(context, vi) [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] image_cache(vi, tmp_image_ds_loc) [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] vm_util.copy_virtual_disk( [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] session._wait_for_task(vmdk_copy_task) [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] return self.wait_for_task(task_ref) [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] return evt.wait() [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: 
bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] result = hub.switch() [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] return self.greenlet.switch() [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] self.f(*self.args, **self.kw) [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] raise exceptions.translate_fault(task_info.error) [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Faults: ['InvalidArgument'] [ 929.561499] env[60722]: ERROR nova.compute.manager [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] [ 929.562548] env[60722]: DEBUG nova.compute.utils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] VimFaultException {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 929.563507] env[60722]: DEBUG nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Build of instance bc2a1e45-2f48-4a73-bfee-69a20725a610 was re-scheduled: A specified parameter was not correct: fileType [ 929.563507] env[60722]: Faults: ['InvalidArgument'] {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 929.563902] env[60722]: DEBUG nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 929.564110] env[60722]: DEBUG nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 929.564309] env[60722]: DEBUG nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 929.564438] env[60722]: DEBUG nova.network.neutron [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 929.833194] env[60722]: DEBUG nova.network.neutron [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 929.850660] env[60722]: INFO nova.compute.manager [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Took 0.28 seconds to deallocate network for instance. [ 929.933985] env[60722]: INFO nova.scheduler.client.report [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Deleted allocations for instance bc2a1e45-2f48-4a73-bfee-69a20725a610 [ 929.952426] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a9d01a0a-72b2-4596-914c-dec6d30e1d5f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "bc2a1e45-2f48-4a73-bfee-69a20725a610" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 334.811s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 929.953617] env[60722]: DEBUG oslo_concurrency.lockutils [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "bc2a1e45-2f48-4a73-bfee-69a20725a610" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 135.939s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 929.953822] env[60722]: DEBUG oslo_concurrency.lockutils [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "bc2a1e45-2f48-4a73-bfee-69a20725a610-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 929.954026] env[60722]: DEBUG oslo_concurrency.lockutils [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "bc2a1e45-2f48-4a73-bfee-69a20725a610-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 929.954192] env[60722]: DEBUG oslo_concurrency.lockutils [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "bc2a1e45-2f48-4a73-bfee-69a20725a610-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 929.960241] env[60722]: INFO nova.compute.manager [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Terminating instance [ 929.964016] env[60722]: DEBUG nova.compute.manager [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 929.964016] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 929.964016] env[60722]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-41edccea-98c2-4821-b5a7-aa94c7ac692a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 929.967517] env[60722]: DEBUG nova.compute.manager [None req-9f364c40-2870-4552-952f-1588106b934b tempest-SecurityGroupsTestJSON-205882840 tempest-SecurityGroupsTestJSON-205882840-project-member] [instance: 151df220-ca11-4455-b620-f8fe5a1be5b7] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 929.973491] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2d98ad4-6b53-40e8-8ff5-be4b252ad393 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 929.990702] env[60722]: DEBUG nova.compute.manager [None req-9f364c40-2870-4552-952f-1588106b934b tempest-SecurityGroupsTestJSON-205882840 tempest-SecurityGroupsTestJSON-205882840-project-member] [instance: 151df220-ca11-4455-b620-f8fe5a1be5b7] Instance disappeared before build. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 930.000311] env[60722]: WARNING nova.virt.vmwareapi.vmops [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance bc2a1e45-2f48-4a73-bfee-69a20725a610 could not be found. 
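Editor's note: the "Acquiring lock … / Lock … acquired … waited Ns / Lock … released … held Ns" DEBUG lines throughout this trace are emitted by oslo.concurrency, from the `inner` wrapper of the `synchronized` decorator (lockutils.py:404/409/423 in the entries above) and from the `lock()` context manager (lockutils.py:312/315/333 in later entries). A minimal, hypothetical sketch of both forms, not Nova's actual code; the lock names are taken from the log, everything else is illustrative:

```python
# Sketch only: shows the two oslo.concurrency lock forms whose DEBUG output
# appears in this log. Lock names are copied from the trace; the function
# bodies are placeholders.
from oslo_concurrency import lockutils

# Decorator form: calls serialize on the named lock, and the wrapper logs the
# "Acquiring lock ... by ... / acquired ... waited / released ... held" lines.
@lockutils.synchronized('bc2a1e45-2f48-4a73-bfee-69a20725a610-events')
def _clear_events():
    # ... pop and discard any queued external events for the instance ...
    pass

# Context-manager form, as used for e.g. the "compute_resources" and
# "refresh_cache-<uuid>" locks seen later in the trace.
def claim_resources():
    with lockutils.lock('compute_resources'):
        # ... resource tracker work happens while the lock is held ...
        pass
```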
[ 930.000503] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 930.000666] env[60722]: INFO nova.compute.manager [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Took 0.04 seconds to destroy the instance on the hypervisor. [ 930.000907] env[60722]: DEBUG oslo.service.loopingcall [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 930.001124] env[60722]: DEBUG nova.compute.manager [-] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 930.001218] env[60722]: DEBUG nova.network.neutron [-] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 930.020224] env[60722]: DEBUG oslo_concurrency.lockutils [None req-9f364c40-2870-4552-952f-1588106b934b tempest-SecurityGroupsTestJSON-205882840 tempest-SecurityGroupsTestJSON-205882840-project-member] Lock "151df220-ca11-4455-b620-f8fe5a1be5b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 241.505s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 930.022647] env[60722]: DEBUG nova.network.neutron [-] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 930.030357] env[60722]: INFO nova.compute.manager [-] [instance: bc2a1e45-2f48-4a73-bfee-69a20725a610] Took 0.03 seconds to deallocate network for instance. [ 930.030643] env[60722]: DEBUG nova.compute.manager [None req-51fd68c0-ea71-4901-8fd0-76820c3d7c52 tempest-ServerPasswordTestJSON-1030397772 tempest-ServerPasswordTestJSON-1030397772-project-member] [instance: 47f34953-a5e3-4b5e-9164-ec5980802298] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 930.053134] env[60722]: DEBUG nova.compute.manager [None req-51fd68c0-ea71-4901-8fd0-76820c3d7c52 tempest-ServerPasswordTestJSON-1030397772 tempest-ServerPasswordTestJSON-1030397772-project-member] [instance: 47f34953-a5e3-4b5e-9164-ec5980802298] Instance disappeared before build. 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 930.072041] env[60722]: DEBUG oslo_concurrency.lockutils [None req-51fd68c0-ea71-4901-8fd0-76820c3d7c52 tempest-ServerPasswordTestJSON-1030397772 tempest-ServerPasswordTestJSON-1030397772-project-member] Lock "47f34953-a5e3-4b5e-9164-ec5980802298" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 241.207s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 930.086708] env[60722]: DEBUG nova.compute.manager [None req-dfc281b1-df48-410b-a8dc-dafe3b200ee8 tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 7f1d5c92-ea40-4ad1-b669-877028b69711] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 930.108980] env[60722]: DEBUG nova.compute.manager [None req-dfc281b1-df48-410b-a8dc-dafe3b200ee8 tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] [instance: 7f1d5c92-ea40-4ad1-b669-877028b69711] Instance disappeared before build. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 930.128225] env[60722]: DEBUG oslo_concurrency.lockutils [None req-dfc281b1-df48-410b-a8dc-dafe3b200ee8 tempest-DeleteServersAdminTestJSON-935499153 tempest-DeleteServersAdminTestJSON-935499153-project-member] Lock "7f1d5c92-ea40-4ad1-b669-877028b69711" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 240.438s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 930.143421] env[60722]: DEBUG nova.compute.manager [None req-2f879def-e0e0-4041-95c5-ecdafea31b68 tempest-ServersTestBootFromVolume-2075344351 tempest-ServersTestBootFromVolume-2075344351-project-member] [instance: ecade932-ccf5-4a5c-8348-6d88a311f3a1] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 930.145896] env[60722]: DEBUG oslo_concurrency.lockutils [None req-32d5ec17-d04e-4ae2-9085-4126161f478a tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "bc2a1e45-2f48-4a73-bfee-69a20725a610" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.192s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 930.165142] env[60722]: DEBUG nova.compute.manager [None req-2f879def-e0e0-4041-95c5-ecdafea31b68 tempest-ServersTestBootFromVolume-2075344351 tempest-ServersTestBootFromVolume-2075344351-project-member] [instance: ecade932-ccf5-4a5c-8348-6d88a311f3a1] Instance disappeared before build. 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 930.183737] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2f879def-e0e0-4041-95c5-ecdafea31b68 tempest-ServersTestBootFromVolume-2075344351 tempest-ServersTestBootFromVolume-2075344351-project-member] Lock "ecade932-ccf5-4a5c-8348-6d88a311f3a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 240.087s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 930.193169] env[60722]: DEBUG nova.compute.manager [None req-8d7183c9-1482-4f24-8e20-e57843b3ab2c tempest-ServersV294TestFqdnHostnames-1444459050 tempest-ServersV294TestFqdnHostnames-1444459050-project-member] [instance: f60a4fc6-1a93-4f54-8b34-71aa0a2e035c] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 930.215300] env[60722]: DEBUG nova.compute.manager [None req-8d7183c9-1482-4f24-8e20-e57843b3ab2c tempest-ServersV294TestFqdnHostnames-1444459050 tempest-ServersV294TestFqdnHostnames-1444459050-project-member] [instance: f60a4fc6-1a93-4f54-8b34-71aa0a2e035c] Instance disappeared before build. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 930.234305] env[60722]: DEBUG oslo_concurrency.lockutils [None req-8d7183c9-1482-4f24-8e20-e57843b3ab2c tempest-ServersV294TestFqdnHostnames-1444459050 tempest-ServersV294TestFqdnHostnames-1444459050-project-member] Lock "f60a4fc6-1a93-4f54-8b34-71aa0a2e035c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 238.681s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 930.241916] env[60722]: DEBUG nova.compute.manager [None req-7b0464b1-c34a-4a7b-a452-3247651f91c9 tempest-AttachVolumeNegativeTest-1992155444 tempest-AttachVolumeNegativeTest-1992155444-project-member] [instance: 22eaddb2-108d-44e4-8d9d-85fc91efa9f9] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 930.265572] env[60722]: DEBUG nova.compute.manager [None req-7b0464b1-c34a-4a7b-a452-3247651f91c9 tempest-AttachVolumeNegativeTest-1992155444 tempest-AttachVolumeNegativeTest-1992155444-project-member] [instance: 22eaddb2-108d-44e4-8d9d-85fc91efa9f9] Instance disappeared before build. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 930.285119] env[60722]: DEBUG oslo_concurrency.lockutils [None req-7b0464b1-c34a-4a7b-a452-3247651f91c9 tempest-AttachVolumeNegativeTest-1992155444 tempest-AttachVolumeNegativeTest-1992155444-project-member] Lock "22eaddb2-108d-44e4-8d9d-85fc91efa9f9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 238.209s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 930.292250] env[60722]: DEBUG nova.compute.manager [None req-29b03021-5125-4808-8eb6-162ef4d727f0 tempest-ServerActionsTestOtherB-2034715177 tempest-ServerActionsTestOtherB-2034715177-project-member] [instance: 825f4673-e46f-4b32-a3d8-4bb163ac2390] Starting instance... 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 930.314433] env[60722]: DEBUG nova.compute.manager [None req-29b03021-5125-4808-8eb6-162ef4d727f0 tempest-ServerActionsTestOtherB-2034715177 tempest-ServerActionsTestOtherB-2034715177-project-member] [instance: 825f4673-e46f-4b32-a3d8-4bb163ac2390] Instance disappeared before build. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 930.333151] env[60722]: DEBUG oslo_concurrency.lockutils [None req-29b03021-5125-4808-8eb6-162ef4d727f0 tempest-ServerActionsTestOtherB-2034715177 tempest-ServerActionsTestOtherB-2034715177-project-member] Lock "825f4673-e46f-4b32-a3d8-4bb163ac2390" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.022s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 930.341131] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 930.389020] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 930.389316] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 930.390723] env[60722]: INFO nova.compute.claims [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 930.519365] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-712e4e7b-41ab-43ad-a720-5a1b0e4c758c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.526803] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20dd572d-6241-4439-91c9-4ac91a1c563b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.556985] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-203151e9-c32d-4cb7-8307-dfca9a8539eb {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.564163] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae4238c6-c570-42eb-adf6-3e9bf85915fb {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.577312] env[60722]: DEBUG nova.compute.provider_tree [None 
req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 930.585962] env[60722]: DEBUG nova.scheduler.client.report [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 930.598446] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 930.599447] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 930.631994] env[60722]: DEBUG nova.compute.utils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 930.633308] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 930.633483] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 930.646298] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Start building block device mappings for instance. 
{{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 930.716723] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Start spawning the instance on the hypervisor. {{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 930.737970] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 930.738288] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 930.738472] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 930.738683] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 930.738855] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 930.739037] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 930.739280] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 930.739439] 
env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 930.739637] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 930.739844] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 930.740062] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 930.741635] env[60722]: DEBUG nova.policy [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5a70e6a9017043d5ab0b627f0a1423e8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24a558c40311424d9c88a84256be240b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 930.743869] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58c5e283-4ff1-4c36-9539-865127b415f1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 930.752196] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf6d5c29-2ddf-42e9-81f5-b9ae2a699a39 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 931.240720] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Successfully created port: 0e53ccc1-3ce8-4448-af31-4820082696dc {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 931.686799] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "2d58b057-fec8-4c3c-bf83-452d27abfd38" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 931.687415] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f 
tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "2d58b057-fec8-4c3c-bf83-452d27abfd38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 931.787114] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Successfully updated port: 0e53ccc1-3ce8-4448-af31-4820082696dc {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 931.800031] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquiring lock "refresh_cache-b9025e22-8080-4887-8e4e-179866f704ca" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 931.800187] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquired lock "refresh_cache-b9025e22-8080-4887-8e4e-179866f704ca" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 931.800334] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 931.839551] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Instance cache missing network info. 
{{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 931.910280] env[60722]: DEBUG nova.compute.manager [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Received event network-vif-plugged-0e53ccc1-3ce8-4448-af31-4820082696dc {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 931.910834] env[60722]: DEBUG oslo_concurrency.lockutils [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] Acquiring lock "b9025e22-8080-4887-8e4e-179866f704ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 931.911059] env[60722]: DEBUG oslo_concurrency.lockutils [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] Lock "b9025e22-8080-4887-8e4e-179866f704ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 931.911227] env[60722]: DEBUG oslo_concurrency.lockutils [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] Lock "b9025e22-8080-4887-8e4e-179866f704ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 931.911398] env[60722]: DEBUG nova.compute.manager [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] [instance: b9025e22-8080-4887-8e4e-179866f704ca] No waiting events found dispatching network-vif-plugged-0e53ccc1-3ce8-4448-af31-4820082696dc {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 931.911557] env[60722]: WARNING nova.compute.manager [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Received unexpected event network-vif-plugged-0e53ccc1-3ce8-4448-af31-4820082696dc for instance with vm_state building and task_state spawning. [ 931.911711] env[60722]: DEBUG nova.compute.manager [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Received event network-changed-0e53ccc1-3ce8-4448-af31-4820082696dc {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 931.911858] env[60722]: DEBUG nova.compute.manager [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Refreshing instance network info cache due to event network-changed-0e53ccc1-3ce8-4448-af31-4820082696dc. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 931.912046] env[60722]: DEBUG oslo_concurrency.lockutils [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] Acquiring lock "refresh_cache-b9025e22-8080-4887-8e4e-179866f704ca" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 932.005914] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Updating instance_info_cache with network_info: [{"id": "0e53ccc1-3ce8-4448-af31-4820082696dc", "address": "fa:16:3e:2d:05:b9", "network": {"id": "2fab50bd-0c37-4e7b-a49a-8a07f4f942a9", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-513956274-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24a558c40311424d9c88a84256be240b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1fb81f98-6f5a-47ab-a512-27277591d064", "external-id": "nsx-vlan-transportzone-624", "segmentation_id": 624, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0e53ccc1-3c", "ovs_interfaceid": "0e53ccc1-3ce8-4448-af31-4820082696dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 932.016309] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Releasing lock "refresh_cache-b9025e22-8080-4887-8e4e-179866f704ca" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 932.016577] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Instance network_info: |[{"id": "0e53ccc1-3ce8-4448-af31-4820082696dc", "address": "fa:16:3e:2d:05:b9", "network": {"id": "2fab50bd-0c37-4e7b-a49a-8a07f4f942a9", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-513956274-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24a558c40311424d9c88a84256be240b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1fb81f98-6f5a-47ab-a512-27277591d064", "external-id": "nsx-vlan-transportzone-624", "segmentation_id": 624, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0e53ccc1-3c", "ovs_interfaceid": 
"0e53ccc1-3ce8-4448-af31-4820082696dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 932.016837] env[60722]: DEBUG oslo_concurrency.lockutils [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] Acquired lock "refresh_cache-b9025e22-8080-4887-8e4e-179866f704ca" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 932.017013] env[60722]: DEBUG nova.network.neutron [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Refreshing network info cache for port 0e53ccc1-3ce8-4448-af31-4820082696dc {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 932.017978] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2d:05:b9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1fb81f98-6f5a-47ab-a512-27277591d064', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0e53ccc1-3ce8-4448-af31-4820082696dc', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 932.026174] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Creating folder: Project (24a558c40311424d9c88a84256be240b). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 932.027082] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1964ad2f-87dc-4a3e-896b-0fe4f355e609 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 932.039862] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Created folder: Project (24a558c40311424d9c88a84256be240b) in parent group-v141606. [ 932.040060] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Creating folder: Instances. Parent ref: group-v141653. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 932.040284] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d6dce3b7-eb3f-42a4-b982-2f384fac9091 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 932.049228] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Created folder: Instances in parent group-v141653. 
[ 932.049429] env[60722]: DEBUG oslo.service.loopingcall [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 932.049598] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 932.049774] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1beaab5c-8253-48d6-8add-d76375d513a3 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 932.071724] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 932.071724] env[60722]: value = "task-565190" [ 932.071724] env[60722]: _type = "Task" [ 932.071724] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 932.078763] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565190, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 932.304038] env[60722]: DEBUG nova.network.neutron [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Updated VIF entry in instance network info cache for port 0e53ccc1-3ce8-4448-af31-4820082696dc. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 932.304394] env[60722]: DEBUG nova.network.neutron [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Updating instance_info_cache with network_info: [{"id": "0e53ccc1-3ce8-4448-af31-4820082696dc", "address": "fa:16:3e:2d:05:b9", "network": {"id": "2fab50bd-0c37-4e7b-a49a-8a07f4f942a9", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-513956274-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24a558c40311424d9c88a84256be240b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1fb81f98-6f5a-47ab-a512-27277591d064", "external-id": "nsx-vlan-transportzone-624", "segmentation_id": 624, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0e53ccc1-3c", "ovs_interfaceid": "0e53ccc1-3ce8-4448-af31-4820082696dc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 932.313354] env[60722]: DEBUG oslo_concurrency.lockutils [req-d0a186e8-df1d-4dc4-8e55-58b21217831b req-cbbe00bc-78bc-4fbd-a554-d065c1008f3d service nova] Releasing lock "refresh_cache-b9025e22-8080-4887-8e4e-179866f704ca" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 
932.580669] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565190, 'name': CreateVM_Task, 'duration_secs': 0.278117} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 932.580852] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 932.581510] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 932.581669] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 932.581983] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 932.582230] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e1fdce63-7948-4f80-9cdf-e39b1d9e2284 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 932.586423] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Waiting for the task: (returnval){ [ 932.586423] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52e42fb9-85da-35ed-3af5-aed173ab59e6" [ 932.586423] env[60722]: _type = "Task" [ 932.586423] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 932.593634] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52e42fb9-85da-35ed-3af5-aed173ab59e6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 933.096816] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 933.097188] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 933.097237] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 958.274054] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 958.274054] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 958.274054] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60722) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 958.945362] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 959.940058] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 959.943607] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 960.943666] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 960.943927] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 960.953326] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 960.953530] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 960.953687] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 960.953858] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 960.954898] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4e88389-c38a-4f3a-b492-0cd19f295712 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.963470] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6dbe58f-4a35-4fe0-b5b4-2a07e0ea9f77 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.978047] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-108b0913-8f1b-4c66-b79b-030da8adbc7c {{(pid=60722) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.983193] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f12da6e-b57d-4a0f-b643-9c7471440276 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.012995] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181679MB free_disk=100GB free_vcpus=48 pci_devices=None {{(pid=60722) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 961.013144] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 961.013305] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 961.058486] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 65901b4a-42cf-4795-abc2-b0fea1f4fee7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 961.058627] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance e93b8d4b-6286-410a-870a-02fa7e59d90d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 961.058753] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 93268011-e1f2-4041-b4df-473c06d3f1eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 961.058874] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance b9025e22-8080-4887-8e4e-179866f704ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 961.068667] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 020c2b79-e755-4178-aa85-5ecaa31e7a9f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 961.078036] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 8d78f310-a2f2-4073-8371-afc42cc566f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 961.086423] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 2d58b057-fec8-4c3c-bf83-452d27abfd38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 961.086612] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 961.086756] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1088MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 961.167278] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d568fead-0b01-4e99-af15-5bf5abc133cb {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.175098] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4728d63b-932f-4219-b8de-c4bb0f2fbee2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.203975] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9a885b2-7514-416d-abbf-0f9ea4e8e69c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.210934] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4443140d-8a70-4913-ab4e-0cb0e5700db9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.223720] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 961.231638] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 961.243772] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 961.243957] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.231s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 962.244717] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 962.945586] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 962.945802] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Starting heal instance info cache {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 962.945802] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Rebuilding the list of instances to heal {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 962.960013] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 962.960188] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 962.960319] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 962.960451] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 962.960574] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Didn't find any instances for network info cache update. 
{{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 976.070431] env[60722]: WARNING oslo_vmware.rw_handles [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 976.070431] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 976.070431] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 976.070431] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 976.070431] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 976.070431] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 976.070431] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 976.070431] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 976.070431] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 976.070431] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 976.070431] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 976.070431] env[60722]: ERROR oslo_vmware.rw_handles [ 976.070960] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/baba356a-a489-4152-8095-c6d47cf04719/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 976.072414] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Caching image {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 976.072669] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Copying Virtual Disk [datastore1] vmware_temp/baba356a-a489-4152-8095-c6d47cf04719/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/baba356a-a489-4152-8095-c6d47cf04719/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 976.072992] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7974d6a6-e2ae-40fb-b825-a037a32e030e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.082018] env[60722]: DEBUG oslo_vmware.api [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Waiting for the task: (returnval){ [ 976.082018] env[60722]: value = 
"task-565191" [ 976.082018] env[60722]: _type = "Task" [ 976.082018] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 976.089317] env[60722]: DEBUG oslo_vmware.api [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Task: {'id': task-565191, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 976.592603] env[60722]: DEBUG oslo_vmware.exceptions [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Fault InvalidArgument not matched. {{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 976.592870] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 976.593415] env[60722]: ERROR nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 976.593415] env[60722]: Faults: ['InvalidArgument'] [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Traceback (most recent call last): [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] yield resources [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] self.driver.spawn(context, instance, image_meta, [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] self._fetch_image_if_missing(context, vi) [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] image_cache(vi, tmp_image_ds_loc) [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 
93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] vm_util.copy_virtual_disk( [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] session._wait_for_task(vmdk_copy_task) [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] return self.wait_for_task(task_ref) [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] return evt.wait() [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] result = hub.switch() [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] return self.greenlet.switch() [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] self.f(*self.args, **self.kw) [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] raise exceptions.translate_fault(task_info.error) [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Faults: ['InvalidArgument'] [ 976.593415] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] [ 976.594206] env[60722]: INFO nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Terminating instance [ 976.595288] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 976.595489] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 976.595710] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-20a0adb3-2fb4-4db1-adf8-4f02fbfe36d5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.597828] env[60722]: DEBUG nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 976.597984] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 976.598668] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5110d6d9-5a82-4dfb-8caa-64d246c8249d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.605174] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 976.605362] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-579ae85d-1ad2-4696-8fc6-8706a82dc66a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.607319] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 976.607483] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 976.608374] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ed36672e-ff28-4ad6-bab1-e2a748c40943 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.612632] env[60722]: DEBUG oslo_vmware.api [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for the task: (returnval){ [ 976.612632] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52975512-5432-f3af-e5e7-75b5c8a69ceb" [ 976.612632] env[60722]: _type = "Task" [ 976.612632] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 976.619396] env[60722]: DEBUG oslo_vmware.api [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52975512-5432-f3af-e5e7-75b5c8a69ceb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 976.673055] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 976.673272] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 976.673444] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Deleting the datastore file [datastore1] 93268011-e1f2-4041-b4df-473c06d3f1eb {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 976.673694] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b9217832-0312-41e5-866a-7217240dd8ed {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.679384] env[60722]: DEBUG oslo_vmware.api [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Waiting for the task: (returnval){ [ 976.679384] env[60722]: value = "task-565193" [ 976.679384] env[60722]: _type = "Task" [ 976.679384] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 976.686822] env[60722]: DEBUG oslo_vmware.api [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Task: {'id': task-565193, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 977.122328] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 977.122587] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Creating directory with path [datastore1] vmware_temp/33be67a5-ff4c-433e-8e81-a0c39f2cd212/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 977.122838] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-33367b9c-1a78-43c1-959d-16bcc292a64d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.135259] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Created directory with path [datastore1] vmware_temp/33be67a5-ff4c-433e-8e81-a0c39f2cd212/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 977.135451] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Fetch image to [datastore1] vmware_temp/33be67a5-ff4c-433e-8e81-a0c39f2cd212/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 977.135614] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/33be67a5-ff4c-433e-8e81-a0c39f2cd212/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 977.136370] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd057e8b-6814-48ab-a45f-faf9b3a0d6e8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.142894] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c88f88dd-c82d-4c1a-8e19-4cd477c0cfc4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.152196] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-523fffe7-3c64-4864-85a7-4b810c0638b5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.185033] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1abe6183-5a8a-460d-a10f-4e12bee04998 {{(pid=60722) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.191687] env[60722]: DEBUG oslo_vmware.api [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Task: {'id': task-565193, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079708} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 977.193038] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 977.193228] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 977.193395] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 977.193562] env[60722]: INFO nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 977.195420] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-84950c06-96cf-4818-a001-a8e2ffd28cbd {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.197250] env[60722]: DEBUG nova.compute.claims [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 977.197420] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 977.197622] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 977.217152] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 977.263114] env[60722]: DEBUG oslo_vmware.rw_handles [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/33be67a5-ff4c-433e-8e81-a0c39f2cd212/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 977.317366] env[60722]: DEBUG oslo_vmware.rw_handles [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Completed reading data from the image iterator. {{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 977.317540] env[60722]: DEBUG oslo_vmware.rw_handles [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/33be67a5-ff4c-433e-8e81-a0c39f2cd212/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 977.364884] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1baa81eb-991c-4f64-a4b6-078996736b59 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.372139] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e616814-4661-4f0f-b528-cf965d4efb4a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.400814] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5912ba61-c4cf-4bd9-8649-7cd16c63b786 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.407329] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11683c78-38f6-4dd0-8032-b22940a10416 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.420622] env[60722]: DEBUG nova.compute.provider_tree [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 977.428869] env[60722]: DEBUG nova.scheduler.client.report [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 977.443136] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.245s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.443617] env[60722]: ERROR nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 977.443617] env[60722]: Faults: ['InvalidArgument'] [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Traceback (most recent call last): [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] 
self.driver.spawn(context, instance, image_meta, [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] self._fetch_image_if_missing(context, vi) [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] image_cache(vi, tmp_image_ds_loc) [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] vm_util.copy_virtual_disk( [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] session._wait_for_task(vmdk_copy_task) [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] return self.wait_for_task(task_ref) [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] return evt.wait() [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] result = hub.switch() [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] return self.greenlet.switch() [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] self.f(*self.args, **self.kw) [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 
93268011-e1f2-4041-b4df-473c06d3f1eb] raise exceptions.translate_fault(task_info.error) [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Faults: ['InvalidArgument'] [ 977.443617] env[60722]: ERROR nova.compute.manager [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] [ 977.444414] env[60722]: DEBUG nova.compute.utils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] VimFaultException {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 977.445612] env[60722]: DEBUG nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Build of instance 93268011-e1f2-4041-b4df-473c06d3f1eb was re-scheduled: A specified parameter was not correct: fileType [ 977.445612] env[60722]: Faults: ['InvalidArgument'] {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 977.445972] env[60722]: DEBUG nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 977.446155] env[60722]: DEBUG nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 977.446318] env[60722]: DEBUG nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 977.446471] env[60722]: DEBUG nova.network.neutron [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 977.705494] env[60722]: DEBUG nova.network.neutron [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 977.714835] env[60722]: INFO nova.compute.manager [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Took 0.27 seconds to deallocate network for instance. 
[ 977.799911] env[60722]: INFO nova.scheduler.client.report [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Deleted allocations for instance 93268011-e1f2-4041-b4df-473c06d3f1eb [ 977.818403] env[60722]: DEBUG oslo_concurrency.lockutils [None req-b1508e61-f144-4c62-a4d7-8b7c3a5ba987 tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "93268011-e1f2-4041-b4df-473c06d3f1eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 379.184s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.819491] env[60722]: DEBUG oslo_concurrency.lockutils [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "93268011-e1f2-4041-b4df-473c06d3f1eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 178.563s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 977.819701] env[60722]: DEBUG oslo_concurrency.lockutils [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Acquiring lock "93268011-e1f2-4041-b4df-473c06d3f1eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 977.819892] env[60722]: DEBUG oslo_concurrency.lockutils [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "93268011-e1f2-4041-b4df-473c06d3f1eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 977.820324] env[60722]: DEBUG oslo_concurrency.lockutils [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "93268011-e1f2-4041-b4df-473c06d3f1eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.823041] env[60722]: INFO nova.compute.manager [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Terminating instance [ 977.824452] env[60722]: DEBUG nova.compute.manager [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Start destroying the instance on the hypervisor. 
{{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 977.824643] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 977.825093] env[60722]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0b1505f4-0034-4ea4-9ee6-0d8989488b8e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.834123] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f768967b-45b7-466e-b67a-8e483edca3b4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 977.844690] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 977.863765] env[60722]: WARNING nova.virt.vmwareapi.vmops [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 93268011-e1f2-4041-b4df-473c06d3f1eb could not be found. [ 977.863954] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 977.864149] env[60722]: INFO nova.compute.manager [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Took 0.04 seconds to destroy the instance on the hypervisor. [ 977.864385] env[60722]: DEBUG oslo.service.loopingcall [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 977.864592] env[60722]: DEBUG nova.compute.manager [-] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 977.864685] env[60722]: DEBUG nova.network.neutron [-] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 977.889910] env[60722]: DEBUG nova.network.neutron [-] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 977.894532] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 977.894754] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 977.896561] env[60722]: INFO nova.compute.claims [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 977.899059] env[60722]: INFO nova.compute.manager [-] [instance: 93268011-e1f2-4041-b4df-473c06d3f1eb] Took 0.03 seconds to deallocate network for instance. 
[ 977.983709] env[60722]: DEBUG oslo_concurrency.lockutils [None req-5e2d225f-09d7-47d4-9387-11bc06c658ac tempest-ServersAdminTestJSON-1575840493 tempest-ServersAdminTestJSON-1575840493-project-member] Lock "93268011-e1f2-4041-b4df-473c06d3f1eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.164s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 978.024035] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c810154-9aec-48f8-9421-5d17834709ad {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 978.031976] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1318b140-fff6-4764-ac7d-cb7ac160522e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 978.062072] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6649997b-bbd6-401e-b130-c77b9934c9b8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 978.069440] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-874a89b8-6e4f-43e1-b544-984762d0350c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 978.082291] env[60722]: DEBUG nova.compute.provider_tree [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 978.090550] env[60722]: DEBUG nova.scheduler.client.report [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 978.102834] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 978.103292] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Start building networks asynchronously for instance. 
{{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 978.131112] env[60722]: DEBUG nova.compute.utils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 978.132203] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 978.132369] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 978.140154] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 978.197781] env[60722]: DEBUG nova.policy [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5a70e6a9017043d5ab0b627f0a1423e8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24a558c40311424d9c88a84256be240b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 978.202060] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 978.223676] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 978.223950] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 978.224144] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 978.224328] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 978.224467] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 978.224604] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 978.224801] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 978.224953] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 978.225137] env[60722]: DEBUG nova.virt.hardware [None 
req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 978.225312] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 978.225482] env[60722]: DEBUG nova.virt.hardware [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 978.226308] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0672cb3f-7136-43cf-aaeb-d13fdf2a3343 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 978.233907] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f040554-b7fd-41fe-966d-f5bc81664ab1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 978.463928] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Successfully created port: 08fa4ab6-ed27-4b0e-ab90-f86f481d98ac {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 978.965097] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Successfully updated port: 08fa4ab6-ed27-4b0e-ab90-f86f481d98ac {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 978.976039] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquiring lock "refresh_cache-020c2b79-e755-4178-aa85-5ecaa31e7a9f" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 978.976146] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquired lock "refresh_cache-020c2b79-e755-4178-aa85-5ecaa31e7a9f" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 978.976245] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 979.011338] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 
tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 979.223472] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Updating instance_info_cache with network_info: [{"id": "08fa4ab6-ed27-4b0e-ab90-f86f481d98ac", "address": "fa:16:3e:5e:0d:f8", "network": {"id": "2fab50bd-0c37-4e7b-a49a-8a07f4f942a9", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-513956274-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24a558c40311424d9c88a84256be240b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1fb81f98-6f5a-47ab-a512-27277591d064", "external-id": "nsx-vlan-transportzone-624", "segmentation_id": 624, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap08fa4ab6-ed", "ovs_interfaceid": "08fa4ab6-ed27-4b0e-ab90-f86f481d98ac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 979.234967] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Releasing lock "refresh_cache-020c2b79-e755-4178-aa85-5ecaa31e7a9f" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 979.234967] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Instance network_info: |[{"id": "08fa4ab6-ed27-4b0e-ab90-f86f481d98ac", "address": "fa:16:3e:5e:0d:f8", "network": {"id": "2fab50bd-0c37-4e7b-a49a-8a07f4f942a9", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-513956274-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24a558c40311424d9c88a84256be240b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1fb81f98-6f5a-47ab-a512-27277591d064", "external-id": "nsx-vlan-transportzone-624", "segmentation_id": 624, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap08fa4ab6-ed", "ovs_interfaceid": "08fa4ab6-ed27-4b0e-ab90-f86f481d98ac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 979.235157] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5e:0d:f8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1fb81f98-6f5a-47ab-a512-27277591d064', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '08fa4ab6-ed27-4b0e-ab90-f86f481d98ac', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 979.242511] env[60722]: DEBUG oslo.service.loopingcall [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 979.242956] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 979.243180] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6d568ccc-867a-4b8e-8925-47345d4fdcc6 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 979.263449] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 979.263449] env[60722]: value = "task-565194" [ 979.263449] env[60722]: _type = "Task" [ 979.263449] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 979.271259] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565194, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 979.749973] env[60722]: DEBUG nova.compute.manager [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Received event network-vif-plugged-08fa4ab6-ed27-4b0e-ab90-f86f481d98ac {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 979.749973] env[60722]: DEBUG oslo_concurrency.lockutils [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] Acquiring lock "020c2b79-e755-4178-aa85-5ecaa31e7a9f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 979.750820] env[60722]: DEBUG oslo_concurrency.lockutils [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] Lock "020c2b79-e755-4178-aa85-5ecaa31e7a9f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 979.750820] env[60722]: DEBUG oslo_concurrency.lockutils [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] Lock "020c2b79-e755-4178-aa85-5ecaa31e7a9f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 979.750820] env[60722]: DEBUG nova.compute.manager [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] No waiting events found dispatching network-vif-plugged-08fa4ab6-ed27-4b0e-ab90-f86f481d98ac {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 979.750820] env[60722]: WARNING nova.compute.manager [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Received unexpected event network-vif-plugged-08fa4ab6-ed27-4b0e-ab90-f86f481d98ac for instance with vm_state building and task_state spawning. [ 979.750820] env[60722]: DEBUG nova.compute.manager [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Received event network-changed-08fa4ab6-ed27-4b0e-ab90-f86f481d98ac {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 979.750971] env[60722]: DEBUG nova.compute.manager [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Refreshing instance network info cache due to event network-changed-08fa4ab6-ed27-4b0e-ab90-f86f481d98ac. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 979.751088] env[60722]: DEBUG oslo_concurrency.lockutils [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] Acquiring lock "refresh_cache-020c2b79-e755-4178-aa85-5ecaa31e7a9f" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 979.751229] env[60722]: DEBUG oslo_concurrency.lockutils [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] Acquired lock "refresh_cache-020c2b79-e755-4178-aa85-5ecaa31e7a9f" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 979.751401] env[60722]: DEBUG nova.network.neutron [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Refreshing network info cache for port 08fa4ab6-ed27-4b0e-ab90-f86f481d98ac {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 979.772635] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565194, 'name': CreateVM_Task, 'duration_secs': 0.30342} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 979.772834] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 979.773820] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 979.773943] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 979.774272] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 979.774499] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9a5f76f7-372f-4671-903d-ab6d28afb3a1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 979.778798] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Waiting for the task: (returnval){ [ 979.778798] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]529ce776-a717-76d1-6ff3-73c3b2ada19a" [ 979.778798] env[60722]: _type = "Task" [ 979.778798] env[60722]: } to complete. 
{{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 979.786168] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]529ce776-a717-76d1-6ff3-73c3b2ada19a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 979.986610] env[60722]: DEBUG nova.network.neutron [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Updated VIF entry in instance network info cache for port 08fa4ab6-ed27-4b0e-ab90-f86f481d98ac. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 979.986610] env[60722]: DEBUG nova.network.neutron [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Updating instance_info_cache with network_info: [{"id": "08fa4ab6-ed27-4b0e-ab90-f86f481d98ac", "address": "fa:16:3e:5e:0d:f8", "network": {"id": "2fab50bd-0c37-4e7b-a49a-8a07f4f942a9", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-513956274-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "24a558c40311424d9c88a84256be240b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1fb81f98-6f5a-47ab-a512-27277591d064", "external-id": "nsx-vlan-transportzone-624", "segmentation_id": 624, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap08fa4ab6-ed", "ovs_interfaceid": "08fa4ab6-ed27-4b0e-ab90-f86f481d98ac", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 979.996145] env[60722]: DEBUG oslo_concurrency.lockutils [req-e8086d52-a6f0-44dd-98f5-12e046ceceb9 req-5368a599-9e23-40ae-886a-50a0eeecf069 service nova] Releasing lock "refresh_cache-020c2b79-e755-4178-aa85-5ecaa31e7a9f" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 980.289646] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 980.290372] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 980.290707] env[60722]: DEBUG 
oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 986.677976] env[60722]: DEBUG nova.compute.manager [req-f56bd0fb-c045-4ad3-b6ab-8dba7553d325 req-ee7a6dcc-8370-41bd-a96e-65b4b88e474b service nova] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Received event network-vif-deleted-08fa4ab6-ed27-4b0e-ab90-f86f481d98ac {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 987.136186] env[60722]: DEBUG nova.compute.manager [req-bf25829d-dd1a-4cfa-a3aa-857bed57ca21 req-d6dbcede-c24c-470c-9592-6e9f3cbdb0aa service nova] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Received event network-vif-deleted-0e53ccc1-3ce8-4448-af31-4820082696dc {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1018.945013] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1018.945259] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1018.945419] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1018.945563] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60722) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1021.940863] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1021.943456] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1021.943622] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1022.946047] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1022.946047] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1022.954941] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1022.955152] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1022.955312] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1022.955462] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1022.956493] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61fdcb5a-be7d-48e3-90f3-2216dbe779fa {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.967613] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f7931f1-f383-4b23-87aa-d45e3b229f7f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.981205] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efb59494-bc64-47d7-bb54-a90b4b1397cf 
{{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.986993] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42e5dcf4-cde4-4677-9645-307dadf7385a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.015079] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181720MB free_disk=100GB free_vcpus=48 pci_devices=None {{(pid=60722) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1023.015208] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1023.015382] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1023.053542] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 65901b4a-42cf-4795-abc2-b0fea1f4fee7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1023.053688] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance e93b8d4b-6286-410a-870a-02fa7e59d90d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1023.063303] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 8d78f310-a2f2-4073-8371-afc42cc566f2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1023.072473] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 2d58b057-fec8-4c3c-bf83-452d27abfd38 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1023.072656] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1023.072799] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=832MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1023.127752] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb754b5a-72b4-4f21-9899-85d49e39ec0e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.135557] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c76d279-94f2-4441-8a93-7af21c74cfd9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.166747] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33fad3e9-b305-425b-bfd6-9b607a37a370 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.173805] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1f0e5e0-bb87-429e-96ef-784922d5f637 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.186612] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1023.194695] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1023.206751] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1023.206988] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.192s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1024.201570] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] 
Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1024.944542] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1024.945844] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Starting heal instance info cache {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1024.945844] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Rebuilding the list of instances to heal {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1024.955895] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1024.956059] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1024.956190] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Didn't find any instances for network info cache update. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1026.088115] env[60722]: WARNING oslo_vmware.rw_handles [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1026.088115] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1026.088115] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1026.088115] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1026.088115] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1026.088115] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 1026.088115] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1026.088115] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1026.088115] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1026.088115] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1026.088115] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1026.088115] env[60722]: ERROR oslo_vmware.rw_handles [ 1026.088664] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] 
[instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/33be67a5-ff4c-433e-8e81-a0c39f2cd212/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1026.090346] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Caching image {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1026.090584] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Copying Virtual Disk [datastore1] vmware_temp/33be67a5-ff4c-433e-8e81-a0c39f2cd212/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/33be67a5-ff4c-433e-8e81-a0c39f2cd212/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1026.090846] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4cb14f63-042c-4975-904c-54d2b29bdad3 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.098436] env[60722]: DEBUG oslo_vmware.api [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for the task: (returnval){ [ 1026.098436] env[60722]: value = "task-565195" [ 1026.098436] env[60722]: _type = "Task" [ 1026.098436] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1026.106261] env[60722]: DEBUG oslo_vmware.api [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': task-565195, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1026.608907] env[60722]: DEBUG oslo_vmware.exceptions [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Fault InvalidArgument not matched. 
{{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1026.609110] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1026.609721] env[60722]: ERROR nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1026.609721] env[60722]: Faults: ['InvalidArgument'] [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Traceback (most recent call last): [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] yield resources [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] self.driver.spawn(context, instance, image_meta, [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] self._fetch_image_if_missing(context, vi) [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] image_cache(vi, tmp_image_ds_loc) [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] vm_util.copy_virtual_disk( [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] session._wait_for_task(vmdk_copy_task) [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] return self.wait_for_task(task_ref) [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] return evt.wait() [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] result = hub.switch() [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] return self.greenlet.switch() [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] self.f(*self.args, **self.kw) [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] raise exceptions.translate_fault(task_info.error) [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Faults: ['InvalidArgument'] [ 1026.609721] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] [ 1026.610639] env[60722]: INFO nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Terminating instance [ 1026.611555] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1026.611751] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1026.611973] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1b6cb947-3257-4ae8-ae95-f9146ca6ca73 {{(pid=60722) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.614294] env[60722]: DEBUG nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1026.614477] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1026.615182] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5a94c97-4046-4078-9946-330957061d6c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.621540] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1026.621726] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-37ad80a2-adc4-4321-82f5-a3b3516c77cb {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.623764] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1026.623934] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1026.624895] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6b41f58d-3650-43b7-99af-6f052aa49eeb {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.629565] env[60722]: DEBUG oslo_vmware.api [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Waiting for the task: (returnval){ [ 1026.629565] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]5245e8ac-e857-27c9-6627-4fd033f702c2" [ 1026.629565] env[60722]: _type = "Task" [ 1026.629565] env[60722]: } to complete. 
{{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1026.636201] env[60722]: DEBUG oslo_vmware.api [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]5245e8ac-e857-27c9-6627-4fd033f702c2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1026.683309] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1026.683491] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1026.683653] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Deleting the datastore file [datastore1] e93b8d4b-6286-410a-870a-02fa7e59d90d {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1026.683890] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f36870d7-4521-450b-bf42-eda82695692d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.689949] env[60722]: DEBUG oslo_vmware.api [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for the task: (returnval){ [ 1026.689949] env[60722]: value = "task-565197" [ 1026.689949] env[60722]: _type = "Task" [ 1026.689949] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1026.696998] env[60722]: DEBUG oslo_vmware.api [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': task-565197, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1027.140179] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1027.140496] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Creating directory with path [datastore1] vmware_temp/0c329e0d-c1bd-4c4d-9c98-085b7f121b34/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1027.140619] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-00612b3c-1e04-4f53-a73a-0ff2a2473a3c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.151196] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Created directory with path [datastore1] vmware_temp/0c329e0d-c1bd-4c4d-9c98-085b7f121b34/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1027.151368] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Fetch image to [datastore1] vmware_temp/0c329e0d-c1bd-4c4d-9c98-085b7f121b34/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1027.151528] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/0c329e0d-c1bd-4c4d-9c98-085b7f121b34/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1027.152202] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-843763d7-50bb-4fd8-96fc-c3417f786ba3 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.158249] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b796a33a-60d7-4d17-89c3-ca8cf278595d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.166522] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44a64cef-ab35-41b4-bcd6-2d509f4b3916 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.198110] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-26a5e34e-0d90-40a6-af45-e561b53a1c8f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.205519] env[60722]: DEBUG oslo_vmware.api [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Task: {'id': task-565197, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.071273} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1027.205964] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1027.206154] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1027.206318] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1027.206484] env[60722]: INFO nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Took 0.59 seconds to destroy the instance on the hypervisor. 
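The wait_for_task/_poll_task pairs above (task-565194, task-565195, task-565197) all follow oslo.vmware's task-polling pattern: the vim call returns a Task managed-object reference, and the caller blocks on it while a looping call reports progress until the task succeeds or raises. A minimal sketch of that pattern follows; the connection details and datastore path are placeholders, not values taken from this log.

    # Sketch only: real oslo.vmware calls, placeholder host/credentials/paths.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vcenter.example.test',   # placeholder vCenter endpoint
        'svc-user', 'secret',     # placeholder credentials
        api_retry_count=2,
        task_poll_interval=0.5,   # drives the periodic "progress is N%" polling seen above
    )

    # Any vim method ending in _Task returns a Task moref that can be awaited the
    # same way, e.g. the datastore file deletion recorded as task-565197.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] some-instance-dir',  # placeholder datastore path
        datacenter=None,                        # a real call passes the datacenter moref
    )

    # Blocks until the task completes; raises a translated exception if it errors.
    session.wait_for_task(task)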
[ 1027.207918] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-37ce00fd-226f-4f63-8512-6792a2b96dcf {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.209769] env[60722]: DEBUG nova.compute.claims [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1027.209932] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1027.210157] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1027.294357] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1027.298539] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3527880-9b5c-4570-9c6d-376c621c5949 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.305079] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-619e2d4c-891f-4fbb-bde6-b9be7a1dd885 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.339651] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb8a72ac-1670-4d4d-856c-a1d002dfce82 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.346366] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-406c92e2-8fbe-43e7-902f-1616d0e95524 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.359011] env[60722]: DEBUG nova.compute.provider_tree [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1027.367988] env[60722]: DEBUG nova.scheduler.client.report [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] 
Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1027.374596] env[60722]: DEBUG oslo_vmware.rw_handles [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0c329e0d-c1bd-4c4d-9c98-085b7f121b34/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1027.425550] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.215s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1027.426111] env[60722]: ERROR nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1027.426111] env[60722]: Faults: ['InvalidArgument'] [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Traceback (most recent call last): [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] self.driver.spawn(context, instance, image_meta, [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] self._fetch_image_if_missing(context, vi) [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] image_cache(vi, tmp_image_ds_loc) [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: 
e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] vm_util.copy_virtual_disk( [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] session._wait_for_task(vmdk_copy_task) [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] return self.wait_for_task(task_ref) [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] return evt.wait() [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] result = hub.switch() [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] return self.greenlet.switch() [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] self.f(*self.args, **self.kw) [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] raise exceptions.translate_fault(task_info.error) [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Faults: ['InvalidArgument'] [ 1027.426111] env[60722]: ERROR nova.compute.manager [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] [ 1027.426799] env[60722]: DEBUG nova.compute.utils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] VimFaultException {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1027.429122] env[60722]: DEBUG nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: 
e93b8d4b-6286-410a-870a-02fa7e59d90d] Build of instance e93b8d4b-6286-410a-870a-02fa7e59d90d was re-scheduled: A specified parameter was not correct: fileType [ 1027.429122] env[60722]: Faults: ['InvalidArgument'] {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1027.429480] env[60722]: DEBUG nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1027.429645] env[60722]: DEBUG nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1027.429806] env[60722]: DEBUG nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1027.429961] env[60722]: DEBUG nova.network.neutron [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1027.431890] env[60722]: DEBUG oslo_vmware.rw_handles [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Completed reading data from the image iterator. {{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1027.432059] env[60722]: DEBUG oslo_vmware.rw_handles [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0c329e0d-c1bd-4c4d-9c98-085b7f121b34/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1027.672375] env[60722]: DEBUG nova.network.neutron [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1027.680948] env[60722]: INFO nova.compute.manager [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Took 0.25 seconds to deallocate network for instance. 
[ 1027.767283] env[60722]: INFO nova.scheduler.client.report [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Deleted allocations for instance e93b8d4b-6286-410a-870a-02fa7e59d90d [ 1027.786586] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e2168e6e-9018-4de5-bf1f-121986ca947f tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "e93b8d4b-6286-410a-870a-02fa7e59d90d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 429.584s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1027.787643] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "e93b8d4b-6286-410a-870a-02fa7e59d90d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 229.073s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1027.787865] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Acquiring lock "e93b8d4b-6286-410a-870a-02fa7e59d90d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1027.788081] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "e93b8d4b-6286-410a-870a-02fa7e59d90d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1027.788283] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "e93b8d4b-6286-410a-870a-02fa7e59d90d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1027.790082] env[60722]: INFO nova.compute.manager [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Terminating instance [ 1027.792093] env[60722]: DEBUG nova.compute.manager [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Start destroying the instance on the hypervisor. 
{{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1027.792284] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1027.792555] env[60722]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1d00f361-579a-4116-8cf1-6048461d5289 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.803020] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c7724cd-c2ff-457d-8756-47f859ec0375 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.813730] env[60722]: DEBUG nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1027.832601] env[60722]: WARNING nova.virt.vmwareapi.vmops [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e93b8d4b-6286-410a-870a-02fa7e59d90d could not be found. [ 1027.832786] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1027.832998] env[60722]: INFO nova.compute.manager [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1027.833251] env[60722]: DEBUG oslo.service.loopingcall [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1027.833456] env[60722]: DEBUG nova.compute.manager [-] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1027.833551] env[60722]: DEBUG nova.network.neutron [-] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1027.857279] env[60722]: DEBUG nova.network.neutron [-] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1027.860128] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1027.860348] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1027.861738] env[60722]: INFO nova.compute.claims [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1027.865839] env[60722]: INFO nova.compute.manager [-] [instance: e93b8d4b-6286-410a-870a-02fa7e59d90d] Took 0.03 seconds to deallocate network for instance. 
[ 1027.944638] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1007fb51-4fc1-495b-8b48-dd671d8b6373 tempest-ListServerFiltersTestJSON-1479885286 tempest-ListServerFiltersTestJSON-1479885286-project-member] Lock "e93b8d4b-6286-410a-870a-02fa7e59d90d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.157s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1027.947979] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7592bc23-85a2-4de7-bdf5-6f00c5534b19 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.955946] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dd70e35-9377-43db-a4b3-f8933b28b79d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.985126] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a79eb57c-5741-4cac-99b6-5fc0792ce647 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.992334] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-037fcdb2-6682-4bcc-81fd-8813cc187e59 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.005168] env[60722]: DEBUG nova.compute.provider_tree [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1028.013328] env[60722]: DEBUG nova.scheduler.client.report [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1028.025078] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.165s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1028.025539] env[60722]: DEBUG nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Start building networks asynchronously for instance. 
{{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1028.057144] env[60722]: DEBUG nova.compute.utils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1028.058523] env[60722]: DEBUG nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1028.058762] env[60722]: DEBUG nova.network.neutron [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1028.066126] env[60722]: DEBUG nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1028.111518] env[60722]: DEBUG nova.policy [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '39a47bfac26a479d91d2b2a8fd304a03', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cb6869361691457bb10a943f78202a7d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 1028.125595] env[60722]: DEBUG nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1028.146153] env[60722]: DEBUG nova.virt.hardware [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1028.146470] env[60722]: DEBUG nova.virt.hardware [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1028.146522] env[60722]: DEBUG nova.virt.hardware [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1028.146694] env[60722]: DEBUG nova.virt.hardware [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1028.146832] env[60722]: DEBUG nova.virt.hardware [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1028.146971] env[60722]: DEBUG nova.virt.hardware [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1028.147182] env[60722]: DEBUG nova.virt.hardware [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1028.147332] env[60722]: DEBUG nova.virt.hardware [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1028.147489] env[60722]: DEBUG nova.virt.hardware [None 
req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1028.147644] env[60722]: DEBUG nova.virt.hardware [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1028.147806] env[60722]: DEBUG nova.virt.hardware [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1028.148638] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52da4b90-afdb-4c54-9cb4-70f3a9612340 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.156562] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab41e82f-d7e8-4ad5-ba92-3da62fd9264f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.380329] env[60722]: DEBUG nova.network.neutron [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Successfully created port: 2d83902a-f312-4c9b-8f37-3857c1c8e091 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1028.868434] env[60722]: DEBUG nova.network.neutron [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Successfully updated port: 2d83902a-f312-4c9b-8f37-3857c1c8e091 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1028.878486] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Acquiring lock "refresh_cache-8d78f310-a2f2-4073-8371-afc42cc566f2" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1028.878631] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Acquired lock "refresh_cache-8d78f310-a2f2-4073-8371-afc42cc566f2" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1028.878776] env[60722]: DEBUG nova.network.neutron [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1028.911065] env[60722]: DEBUG nova.network.neutron [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 
tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Instance cache missing network info. {{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1029.055733] env[60722]: DEBUG nova.network.neutron [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Updating instance_info_cache with network_info: [{"id": "2d83902a-f312-4c9b-8f37-3857c1c8e091", "address": "fa:16:3e:40:6d:11", "network": {"id": "15b590ec-3e68-43c1-971f-8b0a8eca22e6", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-168401077-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cb6869361691457bb10a943f78202a7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "510d3c47-3615-43d5-aa5d-a279fd915e71", "external-id": "nsx-vlan-transportzone-436", "segmentation_id": 436, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2d83902a-f3", "ovs_interfaceid": "2d83902a-f312-4c9b-8f37-3857c1c8e091", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1029.066156] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Releasing lock "refresh_cache-8d78f310-a2f2-4073-8371-afc42cc566f2" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1029.066417] env[60722]: DEBUG nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Instance network_info: |[{"id": "2d83902a-f312-4c9b-8f37-3857c1c8e091", "address": "fa:16:3e:40:6d:11", "network": {"id": "15b590ec-3e68-43c1-971f-8b0a8eca22e6", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-168401077-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cb6869361691457bb10a943f78202a7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "510d3c47-3615-43d5-aa5d-a279fd915e71", "external-id": "nsx-vlan-transportzone-436", "segmentation_id": 436, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2d83902a-f3", "ovs_interfaceid": "2d83902a-f312-4c9b-8f37-3857c1c8e091", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1029.066758] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:40:6d:11', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '510d3c47-3615-43d5-aa5d-a279fd915e71', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2d83902a-f312-4c9b-8f37-3857c1c8e091', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1029.074149] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Creating folder: Project (cb6869361691457bb10a943f78202a7d). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1029.074580] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c04bada3-518a-4958-866b-be547fdf23c8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.085709] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Created folder: Project (cb6869361691457bb10a943f78202a7d) in parent group-v141606. [ 1029.085837] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Creating folder: Instances. Parent ref: group-v141657. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1029.086046] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-24723b25-4b3e-48af-ac01-31e10bf34268 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.095301] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Created folder: Instances in parent group-v141657. [ 1029.095509] env[60722]: DEBUG oslo.service.loopingcall [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1029.095668] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1029.095840] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f5eaffaf-f3f4-4929-906b-a50b63f8975c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.114438] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1029.114438] env[60722]: value = "task-565200" [ 1029.114438] env[60722]: _type = "Task" [ 1029.114438] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1029.121357] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565200, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1029.623732] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565200, 'name': CreateVM_Task, 'duration_secs': 0.28611} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1029.624030] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1029.624527] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1029.624680] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1029.625026] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1029.625259] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a65883a6-204b-4095-b32b-d188b02f64cc {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.630460] env[60722]: DEBUG oslo_vmware.api [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Waiting for the task: (returnval){ [ 1029.630460] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52cf54cb-8da7-a353-681e-315f3454318a" [ 1029.630460] env[60722]: _type = "Task" [ 1029.630460] env[60722]: } to complete. 
{{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1029.637876] env[60722]: DEBUG oslo_vmware.api [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52cf54cb-8da7-a353-681e-315f3454318a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1029.708895] env[60722]: DEBUG nova.compute.manager [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Received event network-vif-plugged-2d83902a-f312-4c9b-8f37-3857c1c8e091 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1029.709093] env[60722]: DEBUG oslo_concurrency.lockutils [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] Acquiring lock "8d78f310-a2f2-4073-8371-afc42cc566f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1029.709303] env[60722]: DEBUG oslo_concurrency.lockutils [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] Lock "8d78f310-a2f2-4073-8371-afc42cc566f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1029.709414] env[60722]: DEBUG oslo_concurrency.lockutils [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] Lock "8d78f310-a2f2-4073-8371-afc42cc566f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1029.709574] env[60722]: DEBUG nova.compute.manager [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] No waiting events found dispatching network-vif-plugged-2d83902a-f312-4c9b-8f37-3857c1c8e091 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1029.709728] env[60722]: WARNING nova.compute.manager [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Received unexpected event network-vif-plugged-2d83902a-f312-4c9b-8f37-3857c1c8e091 for instance with vm_state building and task_state spawning. [ 1029.709880] env[60722]: DEBUG nova.compute.manager [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Received event network-changed-2d83902a-f312-4c9b-8f37-3857c1c8e091 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1029.710037] env[60722]: DEBUG nova.compute.manager [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Refreshing instance network info cache due to event network-changed-2d83902a-f312-4c9b-8f37-3857c1c8e091. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1029.710399] env[60722]: DEBUG oslo_concurrency.lockutils [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] Acquiring lock "refresh_cache-8d78f310-a2f2-4073-8371-afc42cc566f2" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1029.710534] env[60722]: DEBUG oslo_concurrency.lockutils [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] Acquired lock "refresh_cache-8d78f310-a2f2-4073-8371-afc42cc566f2" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1029.710685] env[60722]: DEBUG nova.network.neutron [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Refreshing network info cache for port 2d83902a-f312-4c9b-8f37-3857c1c8e091 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1030.109940] env[60722]: DEBUG nova.network.neutron [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Updated VIF entry in instance network info cache for port 2d83902a-f312-4c9b-8f37-3857c1c8e091. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1030.110320] env[60722]: DEBUG nova.network.neutron [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Updating instance_info_cache with network_info: [{"id": "2d83902a-f312-4c9b-8f37-3857c1c8e091", "address": "fa:16:3e:40:6d:11", "network": {"id": "15b590ec-3e68-43c1-971f-8b0a8eca22e6", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-168401077-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cb6869361691457bb10a943f78202a7d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "510d3c47-3615-43d5-aa5d-a279fd915e71", "external-id": "nsx-vlan-transportzone-436", "segmentation_id": 436, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2d83902a-f3", "ovs_interfaceid": "2d83902a-f312-4c9b-8f37-3857c1c8e091", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1030.119116] env[60722]: DEBUG oslo_concurrency.lockutils [req-a2e33e1e-3314-45f9-aa29-52f873e8785e req-6aa61e02-c0a8-4997-8117-cd2968ce3f3e service nova] Releasing lock "refresh_cache-8d78f310-a2f2-4073-8371-afc42cc566f2" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1030.140496] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Releasing lock "[datastore1] 
devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1030.140717] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1030.140917] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1075.374725] env[60722]: WARNING oslo_vmware.rw_handles [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1075.374725] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1075.374725] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1075.374725] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1075.374725] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1075.374725] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 1075.374725] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1075.374725] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1075.374725] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1075.374725] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1075.374725] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1075.374725] env[60722]: ERROR oslo_vmware.rw_handles [ 1075.375444] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/0c329e0d-c1bd-4c4d-9c98-085b7f121b34/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1075.377550] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Caching image {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1075.377852] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 
tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Copying Virtual Disk [datastore1] vmware_temp/0c329e0d-c1bd-4c4d-9c98-085b7f121b34/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/0c329e0d-c1bd-4c4d-9c98-085b7f121b34/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1075.378216] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-db69d168-8258-41b4-9242-9001d9147f0c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1075.387906] env[60722]: DEBUG oslo_vmware.api [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Waiting for the task: (returnval){ [ 1075.387906] env[60722]: value = "task-565201" [ 1075.387906] env[60722]: _type = "Task" [ 1075.387906] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1075.395289] env[60722]: DEBUG oslo_vmware.api [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Task: {'id': task-565201, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1075.898259] env[60722]: DEBUG oslo_vmware.exceptions [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Fault InvalidArgument not matched. 
{{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1075.898420] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1075.898963] env[60722]: ERROR nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1075.898963] env[60722]: Faults: ['InvalidArgument'] [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Traceback (most recent call last): [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] yield resources [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] self.driver.spawn(context, instance, image_meta, [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] self._fetch_image_if_missing(context, vi) [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] image_cache(vi, tmp_image_ds_loc) [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] vm_util.copy_virtual_disk( [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] session._wait_for_task(vmdk_copy_task) [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] return self.wait_for_task(task_ref) [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] return evt.wait() [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] result = hub.switch() [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] return self.greenlet.switch() [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] self.f(*self.args, **self.kw) [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] raise exceptions.translate_fault(task_info.error) [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Faults: ['InvalidArgument'] [ 1075.898963] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] [ 1075.899951] env[60722]: INFO nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Terminating instance [ 1075.900790] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1075.901017] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1075.901277] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9779b520-16a6-4c29-ad87-d6043268084d {{(pid=60722) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1075.903533] env[60722]: DEBUG nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1075.903717] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1075.904443] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0b56853-4f07-47b0-8c7c-6b04970a9658 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1075.910978] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1075.911180] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d51ab77f-3d12-49af-bf7c-89641f769714 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1075.913226] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1075.913388] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1075.914284] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-318b1a1b-3f80-4e7d-98d9-7f6911820f36 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1075.918882] env[60722]: DEBUG oslo_vmware.api [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Waiting for the task: (returnval){ [ 1075.918882] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]525d7ff8-44f4-c55e-250b-ecf73c843048" [ 1075.918882] env[60722]: _type = "Task" [ 1075.918882] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1075.926458] env[60722]: DEBUG oslo_vmware.api [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]525d7ff8-44f4-c55e-250b-ecf73c843048, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1075.979744] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1075.979963] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1075.980183] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Deleting the datastore file [datastore1] 65901b4a-42cf-4795-abc2-b0fea1f4fee7 {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1075.980419] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2312b6c7-83c5-4908-8f42-be8c27d2deb8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1075.988010] env[60722]: DEBUG oslo_vmware.api [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Waiting for the task: (returnval){ [ 1075.988010] env[60722]: value = "task-565203" [ 1075.988010] env[60722]: _type = "Task" [ 1075.988010] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1075.995410] env[60722]: DEBUG oslo_vmware.api [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Task: {'id': task-565203, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1076.429558] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1076.429977] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Creating directory with path [datastore1] vmware_temp/419c8088-4a80-49ab-ba14-8fc48b228a91/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1076.429977] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b52b6e3f-a021-40c6-99d7-946f22c3d6c7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.440986] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Created directory with path [datastore1] vmware_temp/419c8088-4a80-49ab-ba14-8fc48b228a91/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1076.441181] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Fetch image to [datastore1] vmware_temp/419c8088-4a80-49ab-ba14-8fc48b228a91/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1076.441347] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/419c8088-4a80-49ab-ba14-8fc48b228a91/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1076.442017] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61b34fa5-7c31-4c11-95f5-55fcccd20b64 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.448486] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cb4606e-6bfb-4f82-94ee-124b3280e942 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.457152] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ffbba08-cc2a-4ec6-8538-1d3b74985096 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.487502] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22575a22-75e7-4d67-92f1-7624befd66c9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.497885] 
env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9077aa54-253d-4a5f-aa44-2c9a200f374f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.499472] env[60722]: DEBUG oslo_vmware.api [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Task: {'id': task-565203, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084272} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1076.499692] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1076.499860] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1076.500032] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1076.500201] env[60722]: INFO nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Took 0.60 seconds to destroy the instance on the hypervisor. 
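
The copy_virtual_disk failure in the traceback above surfaces through oslo.vmware's task-polling path: _wait_for_task blocks on an event while a looping call polls the vCenter task, and when the task ends in error the fault ('InvalidArgument' on the fileType parameter) is translated into a VimFaultException and re-raised in the caller. Below is a minimal, illustrative sketch of that polling pattern in plain Python; it is not the oslo.vmware implementation, and names such as get_task_info, poll_interval and TaskPollError are assumptions made for the example.

import time


class TaskPollError(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException (illustrative only)."""

    def __init__(self, msg, fault_list=None):
        super().__init__(msg)
        self.fault_list = fault_list or []


def wait_for_task(get_task_info, poll_interval=0.5):
    # get_task_info() is assumed to return an object with .state
    # ('running' / 'success' / 'error'), .result and .error, roughly like the
    # TaskInfo objects polled in _poll_task in the traceback above.
    while True:
        info = get_task_info()
        if info.state == 'success':
            return info.result
        if info.state == 'error':
            # Equivalent point to "raise exceptions.translate_fault(task_info.error)":
            # the fault name ('InvalidArgument') and the offending property
            # ('fileType') are whatever the server attached to the task error.
            raise TaskPollError(str(info.error), getattr(info.error, 'faults', []))
        time.sleep(poll_interval)
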
[ 1076.502223] env[60722]: DEBUG nova.compute.claims [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1076.502389] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1076.502592] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1076.520694] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1076.584249] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-721cf47c-26a3-40b0-9065-4cddcabb3089 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.591399] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e19d8ed4-a551-423c-aec6-90a516918ce2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.621931] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41cde1e6-490a-4e98-8df0-8ccfa1bf9bdc {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.629029] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b56e92e3-2bbb-42f2-8229-632e1c242213 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.642037] env[60722]: DEBUG nova.compute.provider_tree [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1076.650247] env[60722]: DEBUG nova.scheduler.client.report [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1076.662893] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.160s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1076.663418] env[60722]: ERROR nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1076.663418] env[60722]: Faults: ['InvalidArgument'] [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Traceback (most recent call last): [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] self.driver.spawn(context, instance, image_meta, [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] self._fetch_image_if_missing(context, vi) [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] image_cache(vi, tmp_image_ds_loc) [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] vm_util.copy_virtual_disk( [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] session._wait_for_task(vmdk_copy_task) [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 
65901b4a-42cf-4795-abc2-b0fea1f4fee7] return self.wait_for_task(task_ref) [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] return evt.wait() [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] result = hub.switch() [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] return self.greenlet.switch() [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] self.f(*self.args, **self.kw) [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] raise exceptions.translate_fault(task_info.error) [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Faults: ['InvalidArgument'] [ 1076.663418] env[60722]: ERROR nova.compute.manager [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] [ 1076.664446] env[60722]: DEBUG nova.compute.utils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] VimFaultException {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1076.665514] env[60722]: DEBUG nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Build of instance 65901b4a-42cf-4795-abc2-b0fea1f4fee7 was re-scheduled: A specified parameter was not correct: fileType [ 1076.665514] env[60722]: Faults: ['InvalidArgument'] {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1076.665870] env[60722]: DEBUG nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1076.666055] env[60722]: DEBUG nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 
tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1076.666206] env[60722]: DEBUG nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1076.666361] env[60722]: DEBUG nova.network.neutron [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1076.748667] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1076.750164] env[60722]: ERROR nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. 
[ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Traceback (most recent call last): [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] result = getattr(controller, method)(*args, **kwargs) [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self._get(image_id) [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] resp, body = self.http_client.get(url, headers=header) [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self.request(url, 'GET', **kwargs) [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self._handle_response(resp) [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] raise exc.from_response(resp, resp.content) [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] During handling of the above exception, another exception occurred: [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Traceback (most recent call last): [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] yield resources [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self.driver.spawn(context, instance, image_meta, [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self._fetch_image_if_missing(context, vi) [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] image_fetch(context, vi, tmp_image_ds_loc) [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] images.fetch_image( [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1076.750164] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] metadata = IMAGE_API.get(context, image_ref) [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return session.show(context, image_id, [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] _reraise_translated_image_exception(image_id) [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] raise new_exc.with_traceback(exc_trace) [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] result = getattr(controller, method)(*args, **kwargs) [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self._get(image_id) [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] resp, body = self.http_client.get(url, headers=header) [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self.request(url, 'GET', **kwargs) [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self._handle_response(resp) [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] raise exc.from_response(resp, resp.content) [ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. 
[ 1076.751320] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1076.751320] env[60722]: INFO nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Terminating instance [ 1076.752171] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1076.752298] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1076.752434] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d9979ca3-4514-4d29-b6e6-ca6a8a5ab1bb {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.754754] env[60722]: DEBUG nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1076.754936] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1076.755734] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72d619b7-0710-4e6f-97bc-7fb981e8d14d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.763301] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1076.763504] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-41b595eb-2b1f-432d-8416-ee3aa164471f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.765598] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1076.765761] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1076.766695] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0f6c0bd2-d100-43af-9cfa-a0d8200e690e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.771510] env[60722]: DEBUG oslo_vmware.api [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Waiting for the task: (returnval){ [ 1076.771510] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]527b1c3c-3531-f611-5bd3-8b10171a48dc" [ 1076.771510] env[60722]: _type = "Task" [ 1076.771510] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1076.778980] env[60722]: DEBUG oslo_vmware.api [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]527b1c3c-3531-f611-5bd3-8b10171a48dc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1076.843282] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1076.843513] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1076.843655] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Deleting the datastore file [datastore1] eae8d9ce-9fe3-411e-9fd8-05920fb0af04 {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1076.843932] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d258516d-217d-46c6-90e6-ab38e259740d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1076.850423] env[60722]: DEBUG oslo_vmware.api [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Waiting for the task: (returnval){ [ 1076.850423] env[60722]: value = "task-565205" [ 1076.850423] env[60722]: _type = "Task" [ 1076.850423] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1076.858373] env[60722]: DEBUG oslo_vmware.api [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Task: {'id': task-565205, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1077.028299] env[60722]: DEBUG nova.network.neutron [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1077.044076] env[60722]: INFO nova.compute.manager [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Took 0.38 seconds to deallocate network for instance. [ 1077.130379] env[60722]: INFO nova.scheduler.client.report [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Deleted allocations for instance 65901b4a-42cf-4795-abc2-b0fea1f4fee7 [ 1077.145537] env[60722]: DEBUG oslo_concurrency.lockutils [None req-bb9519a6-267c-435d-b26a-bb818eb22a94 tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 485.595s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1077.146634] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 484.965s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1077.146818] env[60722]: INFO nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] During sync_power_state the instance has a pending task (spawning). Skip. 
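
The lockutils lines above ("acquired ... waited 484.965s", "released ... held 485.595s") come from oslo.concurrency's named in-process locks, which serialize all work keyed on an instance UUID or an image-cache path and report how long a waiter queued and how long the holder kept the lock. The toy sketch below uses plain threading rather than oslo.concurrency; named_lock and the lock registry are invented here purely to illustrate the same acquire/wait/held accounting seen in the log.

import threading
import time
from contextlib import contextmanager

_locks = {}                       # name -> threading.Lock (in-process registry)
_registry_guard = threading.Lock()


@contextmanager
def named_lock(name):
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - t0
    print(f'Lock "{name}" acquired :: waited {waited:.3f}s')
    try:
        yield
    finally:
        held = time.monotonic() - t0 - waited
        lock.release()
        print(f'Lock "{name}" released :: held {held:.3f}s')


# Usage mirroring the log: serialize all operations on one instance UUID.
with named_lock("65901b4a-42cf-4795-abc2-b0fea1f4fee7"):
    pass  # terminate / sync power state / rebuild would run here
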
[ 1077.146981] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1077.147558] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 285.884s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1077.147780] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Acquiring lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1077.147982] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1077.148156] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1077.150883] env[60722]: INFO nova.compute.manager [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Terminating instance [ 1077.152294] env[60722]: DEBUG nova.compute.manager [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Start destroying the instance on the hypervisor. 
{{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1077.152846] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1077.152846] env[60722]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2a263015-93b7-454d-847b-3733e7c85baf {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.161747] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e964c2a-6e1f-4f14-8341-42d0b4bfd608 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.172073] env[60722]: DEBUG nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1077.190557] env[60722]: WARNING nova.virt.vmwareapi.vmops [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 65901b4a-42cf-4795-abc2-b0fea1f4fee7 could not be found. [ 1077.190755] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1077.190948] env[60722]: INFO nova.compute.manager [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1077.191774] env[60722]: DEBUG oslo.service.loopingcall [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1077.191774] env[60722]: DEBUG nova.compute.manager [-] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1077.191774] env[60722]: DEBUG nova.network.neutron [-] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1077.216691] env[60722]: DEBUG nova.network.neutron [-] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1077.223787] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1077.224136] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1077.226015] env[60722]: INFO nova.compute.claims [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1077.228938] env[60722]: INFO nova.compute.manager [-] [instance: 65901b4a-42cf-4795-abc2-b0fea1f4fee7] Took 0.04 seconds to deallocate network for instance. 
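
The teardown above is deliberately idempotent: the FindAllByUuid lookup finds no VM, the driver logs "Instance does not exist on backend" for the InstanceNotFound case, still reports the instance destroyed, and the manager goes on to deallocate the (already empty) network info and release the claim. The following is a hedged sketch of that shape only; find_vm_by_uuid, unregister_vm and delete_datastore_files are placeholder callables for this example, not real Nova or oslo.vmware functions.

import logging

LOG = logging.getLogger(__name__)


class InstanceNotFound(Exception):
    """Placeholder for nova.exception.InstanceNotFound."""


def destroy_instance(instance_uuid, find_vm_by_uuid, unregister_vm,
                     delete_datastore_files):
    try:
        vm_ref = find_vm_by_uuid(instance_uuid)   # cf. SearchIndex.FindAllByUuid
        unregister_vm(vm_ref)                     # cf. VirtualMachine.UnregisterVM
        delete_datastore_files(instance_uuid)     # cf. FileManager.DeleteDatastoreFile_Task
    except InstanceNotFound:
        # Mirrors the WARNING in the log: nothing left to tear down on the
        # backend, but the caller still proceeds with network and claim cleanup.
        LOG.warning("Instance %s does not exist on backend", instance_uuid)
    LOG.info("Instance destroyed")
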
[ 1077.284879] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1077.285134] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Creating directory with path [datastore1] vmware_temp/63d0749d-be0d-4219-9528-ce51a70f237f/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1077.285573] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f89dfe3a-1c7e-4f38-b0be-67e196ba6115 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.297495] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Created directory with path [datastore1] vmware_temp/63d0749d-be0d-4219-9528-ce51a70f237f/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1077.297694] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Fetch image to [datastore1] vmware_temp/63d0749d-be0d-4219-9528-ce51a70f237f/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1077.297836] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/63d0749d-be0d-4219-9528-ce51a70f237f/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1077.298563] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9fab8c7-9549-4c71-b1d0-7a768d4d855e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.307850] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3d73c5c-08ee-4a06-8190-b77c2d98cf86 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.310648] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ddbaf58-f769-4650-a13e-7f043b34a2a4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.320907] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66daebec-86df-44fc-aa99-2a17d1fa1d37 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.325200] env[60722]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1a4b353-ea32-41e6-b39a-bcb84bc04b08 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.380517] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6dbdb1b-e691-4d39-bf68-1974677da954 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.385255] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3e653e97-a08a-4410-b698-46ab1449ebdf tempest-ServerDiagnosticsNegativeTest-1829286237 tempest-ServerDiagnosticsNegativeTest-1829286237-project-member] Lock "65901b4a-42cf-4795-abc2-b0fea1f4fee7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.238s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1077.386464] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e44fcab-0707-49a3-9f66-5fe27e5425a0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.395758] env[60722]: DEBUG oslo_vmware.api [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Task: {'id': task-565205, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062302} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1077.397417] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-77c84644-6bf7-48e1-a9de-f56ed46d1b5a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.399387] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1077.399579] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1077.399749] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1077.399922] env[60722]: INFO nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Took 0.64 seconds to destroy the instance on the hypervisor. 
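
The fetch messages above spell out the datastore layout used by the image cache: the Glance image is first written to a per-request temporary location, [datastore1] vmware_temp/<request-uuid>/<image-id>/tmp-sparse.vmdk, and is then copied into the shared devstack-image-cache_base/<image-id>/<image-id>.vmdk cache entry by the copy_virtual_disk step, which is what failed with InvalidArgument: fileType earlier in this log. A small sketch of that path scheme, with helper names invented for illustration:

import uuid


def temp_fetch_path(datastore, image_id, run_id=None):
    # Throwaway download target, one per fetch attempt.
    run_id = run_id or str(uuid.uuid4())
    return f"[{datastore}] vmware_temp/{run_id}/{image_id}/tmp-sparse.vmdk"


def cached_image_path(datastore, image_id, cache_dir="devstack-image-cache_base"):
    # Shared, lock-protected cache entry that later spawns reuse.
    return f"[{datastore}] {cache_dir}/{image_id}/{image_id}.vmdk"


image = "125a38d9-0f4e-49a0-83bc-e50e222251c8"
print(temp_fetch_path("datastore1", image))
print(cached_image_path("datastore1", image))
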
[ 1077.402401] env[60722]: DEBUG nova.compute.claims [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1077.402563] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1077.405970] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df4e2cec-bc6c-478a-beae-56ab7bc3278e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.418857] env[60722]: DEBUG nova.compute.provider_tree [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1077.428641] env[60722]: DEBUG nova.scheduler.client.report [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1077.441767] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.218s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1077.442256] env[60722]: DEBUG nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Start building networks asynchronously for instance. 
{{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1077.444647] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.042s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1077.469348] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1077.470062] env[60722]: DEBUG nova.compute.utils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Instance eae8d9ce-9fe3-411e-9fd8-05920fb0af04 could not be found. {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1077.471415] env[60722]: DEBUG nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Instance disappeared during build. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1077.471580] env[60722]: DEBUG nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1077.471735] env[60722]: DEBUG nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1077.471896] env[60722]: DEBUG nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1077.472134] env[60722]: DEBUG nova.network.neutron [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1077.475260] env[60722]: DEBUG nova.compute.utils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1077.476936] env[60722]: DEBUG nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1077.477152] env[60722]: DEBUG nova.network.neutron [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1077.484637] env[60722]: DEBUG nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1077.488644] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1077.521484] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1077.522657] env[60722]: ERROR nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. 
[ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Traceback (most recent call last): [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] result = getattr(controller, method)(*args, **kwargs) [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self._get(image_id) [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] resp, body = self.http_client.get(url, headers=header) [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self.request(url, 'GET', **kwargs) [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self._handle_response(resp) [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] raise exc.from_response(resp, resp.content) [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] During handling of the above exception, another exception occurred: [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Traceback (most recent call last): [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] yield resources [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self.driver.spawn(context, instance, image_meta, [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self._fetch_image_if_missing(context, vi) [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] image_fetch(context, vi, tmp_image_ds_loc) [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] images.fetch_image( [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1077.522657] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] metadata = IMAGE_API.get(context, image_ref) [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return session.show(context, image_id, [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] _reraise_translated_image_exception(image_id) [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] raise new_exc.with_traceback(exc_trace) [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] result = getattr(controller, method)(*args, **kwargs) [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self._get(image_id) [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] resp, body = self.http_client.get(url, headers=header) [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self.request(url, 'GET', **kwargs) [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self._handle_response(resp) [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] raise exc.from_response(resp, resp.content) [ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. 
[ 1077.523901] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1077.523901] env[60722]: INFO nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Terminating instance [ 1077.524506] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1077.524634] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1077.525272] env[60722]: DEBUG nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1077.525457] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1077.525829] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a848e73d-0bc0-49e1-ab20-03baa873f384 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.528274] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76445f93-34ab-4644-9e9b-fe5555b87fad {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.532140] env[60722]: DEBUG nova.policy [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8dacf7015bd4aeb8e6b7277a2f0a337', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '03406ae6612c4ceabe8c940d457db3fe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 1077.539369] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1077.539577] env[60722]: DEBUG oslo_vmware.service [-] Invoking 
VirtualMachine.UnregisterVM with opID=oslo.vmware-cd13e497-729c-4e88-96f4-727cfca8e387 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.549542] env[60722]: DEBUG nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Start spawning the instance on the hypervisor. {{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1077.552663] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1077.553081] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1077.553827] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f890c166-8078-4366-8f8e-993b2c944d62 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.558654] env[60722]: DEBUG oslo_vmware.api [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Waiting for the task: (returnval){ [ 1077.558654] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52eec6ec-4587-c198-a2e2-ccb6a57d5b0a" [ 1077.558654] env[60722]: _type = "Task" [ 1077.558654] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1077.566027] env[60722]: DEBUG oslo_vmware.api [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52eec6ec-4587-c198-a2e2-ccb6a57d5b0a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1077.585741] env[60722]: DEBUG neutronclient.v2_0.client [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60722) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1077.588880] env[60722]: ERROR nova.compute.manager [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Traceback (most recent call last): [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] result = getattr(controller, method)(*args, **kwargs) [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self._get(image_id) [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] resp, body = self.http_client.get(url, headers=header) [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self.request(url, 'GET', **kwargs) [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self._handle_response(resp) [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] raise exc.from_response(resp, resp.content) [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] During handling of the above exception, another exception occurred: [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Traceback (most recent call last): [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self.driver.spawn(context, instance, image_meta, [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self._fetch_image_if_missing(context, vi) [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] image_fetch(context, vi, tmp_image_ds_loc) [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] images.fetch_image( [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] metadata = IMAGE_API.get(context, image_ref) [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return session.show(context, image_id, [ 1077.588880] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] _reraise_translated_image_exception(image_id) [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] raise new_exc.with_traceback(exc_trace) [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: 
eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] result = getattr(controller, method)(*args, **kwargs) [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self._get(image_id) [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] resp, body = self.http_client.get(url, headers=header) [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self.request(url, 'GET', **kwargs) [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self._handle_response(resp) [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] raise exc.from_response(resp, resp.content) [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. 
[ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] During handling of the above exception, another exception occurred: [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Traceback (most recent call last): [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self._build_and_run_instance(context, instance, image, [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] with excutils.save_and_reraise_exception(): [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self.force_reraise() [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] raise self.value [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] with self.rt.instance_claim(context, instance, node, allocs, [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self.abort() [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self.tracker.abort_instance_claim(self.context, self.instance, [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1077.590356] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return f(*args, **kwargs) [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self._unset_instance_host_and_node(instance) [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: 
eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] instance.save() [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] updates, result = self.indirection_api.object_action( [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return cctxt.call(context, 'object_action', objinst=objinst, [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] result = self.transport._send( [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self._driver.send(target, ctxt, message, [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] raise result [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] nova.exception_Remote.InstanceNotFound_Remote: Instance eae8d9ce-9fe3-411e-9fd8-05920fb0af04 could not be found. 
[ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Traceback (most recent call last): [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return getattr(target, method)(*args, **kwargs) [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return fn(self, *args, **kwargs) [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] old_ref, inst_ref = db.instance_update_and_get_original( [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return f(*args, **kwargs) [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] with excutils.save_and_reraise_exception() as ectxt: [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self.force_reraise() [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] raise self.value [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return f(*args, 
**kwargs) [ 1077.591471] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return f(context, *args, **kwargs) [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] raise exception.InstanceNotFound(instance_id=uuid) [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] nova.exception.InstanceNotFound: Instance eae8d9ce-9fe3-411e-9fd8-05920fb0af04 could not be found. [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] During handling of the above exception, another exception occurred: [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Traceback (most recent call last): [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] ret = obj(*args, **kwargs) [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] exception_handler_v20(status_code, error_body) [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] raise client_exc(message=error_message, [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1077.592882] 
env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Neutron server returns request_ids: ['req-814bed19-8f69-4988-9883-0c73bc989616'] [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] During handling of the above exception, another exception occurred: [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Traceback (most recent call last): [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self._deallocate_network(context, instance, requested_networks) [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self.network_api.deallocate_for_instance( [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] data = neutron.list_ports(**search_opts) [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] ret = obj(*args, **kwargs) [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self.list('ports', self.ports_path, retrieve_all, [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] ret = obj(*args, **kwargs) [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1077.592882] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] for r in self._pagination(collection, path, **params): [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] res = self.get(path, params=params) [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] ret = obj(*args, **kwargs) [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self.retry_request("GET", action, body=body, [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] ret = obj(*args, **kwargs) [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] return self.do_request(method, action, body=body, [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] ret = obj(*args, **kwargs) [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] self._handle_fault_response(status_code, replybody, resp) [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] raise exception.Unauthorized() [ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] nova.exception.Unauthorized: Not authorized. 
[ 1077.594119] env[60722]: ERROR nova.compute.manager [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] [ 1077.601629] env[60722]: DEBUG nova.virt.hardware [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1077.601852] env[60722]: DEBUG nova.virt.hardware [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1077.602011] env[60722]: DEBUG nova.virt.hardware [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1077.602196] env[60722]: DEBUG nova.virt.hardware [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1077.602337] env[60722]: DEBUG nova.virt.hardware [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1077.602496] env[60722]: DEBUG nova.virt.hardware [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1077.602700] env[60722]: DEBUG nova.virt.hardware [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1077.602900] env[60722]: DEBUG nova.virt.hardware [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1077.603103] env[60722]: DEBUG nova.virt.hardware [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1077.603277] env[60722]: DEBUG nova.virt.hardware [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1077.603449] env[60722]: DEBUG nova.virt.hardware [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1077.604327] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e27e26a-2fb5-46ef-8f09-3d814a159435 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.617833] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87bb4ad0-f6a1-4acd-ace1-47c0140cd620 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.622275] env[60722]: DEBUG oslo_concurrency.lockutils [None req-37dfc1fd-1147-4dbc-9692-b7df8b5c697a tempest-ServersTestJSON-1106866518 tempest-ServersTestJSON-1106866518-project-member] Lock "eae8d9ce-9fe3-411e-9fd8-05920fb0af04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 410.879s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1077.639299] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1077.639492] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1077.639653] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Deleting the datastore file [datastore1] 4e66f1dc-18c6-4d64-9bbe-9b061e795a65 {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1077.639885] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ec74a82e-447e-40fe-a2a3-3dbdd4180ab7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1077.646272] env[60722]: DEBUG oslo_vmware.api [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e 
tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Waiting for the task: (returnval){ [ 1077.646272] env[60722]: value = "task-565207" [ 1077.646272] env[60722]: _type = "Task" [ 1077.646272] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1077.654019] env[60722]: DEBUG oslo_vmware.api [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Task: {'id': task-565207, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1077.869694] env[60722]: DEBUG nova.network.neutron [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Successfully created port: 3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1078.068908] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1078.069160] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Creating directory with path [datastore1] vmware_temp/ff429408-affc-456c-94ac-fec5acb9c6d9/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1078.069374] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f319e16a-8935-4c72-9b4d-642fe67b7cb6 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.083809] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Created directory with path [datastore1] vmware_temp/ff429408-affc-456c-94ac-fec5acb9c6d9/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1078.084011] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Fetch image to [datastore1] vmware_temp/ff429408-affc-456c-94ac-fec5acb9c6d9/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1078.084193] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/ff429408-affc-456c-94ac-fec5acb9c6d9/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 
1078.085223] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3b4ce9e-ffef-4f95-ab9f-6be041990569 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.091803] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72c73ffc-d679-4106-9a92-f029df01b634 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.102177] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97b1eaa0-4182-41e5-9929-337975509954 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.132612] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc524d91-9600-45af-a964-9983c74c89e8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.138385] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7724548f-4cf0-4f91-8ea1-21370b3c04a5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.155815] env[60722]: DEBUG oslo_vmware.api [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Task: {'id': task-565207, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076344} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1078.157115] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1078.157306] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1078.157473] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1078.157636] env[60722]: INFO nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Took 0.63 seconds to destroy the instance on the hypervisor. 
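The DeleteDatastoreFile_Task entries above follow the usual oslo.vmware pattern: a vCenter task is created, then polled ("progress is 0%") until it reports success or error, after which the instance teardown continues. Below is a minimal, self-contained sketch of that poll-until-complete loop. The get_task_info helper and the TaskInfo states are hypothetical stand-ins for the PropertyCollector lookup that oslo_vmware.api.VMwareAPISession.wait_for_task actually performs; this illustrates the pattern, it is not the library's implementation.

# Illustrative sketch only: poll a vCenter-style task until it finishes.
# get_task_info() is a hypothetical stand-in for the property lookup that
# oslo_vmware.api.VMwareAPISession.wait_for_task performs; it is NOT the
# real oslo.vmware API.
import time
from dataclasses import dataclass


@dataclass
class TaskInfo:
    state: str          # 'queued' | 'running' | 'success' | 'error'
    progress: int       # 0-100, as logged ("progress is 0%")
    error: str | None = None


def wait_for_task(get_task_info, task_ref, poll_interval=0.5, timeout=300):
    """Poll task_ref until it succeeds, fails, or times out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_task_info(task_ref)     # one RetrieveProperties round-trip
        if info.state == 'success':
            return info
        if info.state == 'error':
            raise RuntimeError(f"task {task_ref} failed: {info.error}")
        time.sleep(poll_interval)          # e.g. DeleteDatastoreFile_Task at 0%
    raise TimeoutError(f"task {task_ref} did not complete in {timeout}s")


if __name__ == "__main__":
    # Fake task that completes on the third poll, mirroring the log's
    # "progress is 0%" followed by "completed successfully".
    states = iter([TaskInfo('running', 0), TaskInfo('running', 50),
                   TaskInfo('success', 100)])
    print(wait_for_task(lambda ref: next(states), 'task-565207', poll_interval=0))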
[ 1078.159358] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1078.161455] env[60722]: DEBUG nova.compute.claims [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1078.161612] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1078.161810] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1078.186658] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1078.187401] env[60722]: DEBUG nova.compute.utils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Instance 4e66f1dc-18c6-4d64-9bbe-9b061e795a65 could not be found. {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1078.188781] env[60722]: DEBUG nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Instance disappeared during build. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1078.188942] env[60722]: DEBUG nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1078.189110] env[60722]: DEBUG nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1078.189266] env[60722]: DEBUG nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1078.189418] env[60722]: DEBUG nova.network.neutron [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1078.215638] env[60722]: DEBUG neutronclient.v2_0.client [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60722) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1078.217135] env[60722]: ERROR nova.compute.manager [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Traceback (most recent call last): [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] result = getattr(controller, method)(*args, **kwargs) [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self._get(image_id) [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] resp, body = self.http_client.get(url, headers=header) [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self.request(url, 'GET', **kwargs) [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self._handle_response(resp) [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] raise exc.from_response(resp, resp.content) [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] During handling of the above exception, another exception occurred: [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Traceback (most recent call last): [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self.driver.spawn(context, instance, image_meta, [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self._fetch_image_if_missing(context, vi) [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] image_fetch(context, vi, tmp_image_ds_loc) [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] images.fetch_image( [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 
4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] metadata = IMAGE_API.get(context, image_ref) [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return session.show(context, image_id, [ 1078.217135] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] _reraise_translated_image_exception(image_id) [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] raise new_exc.with_traceback(exc_trace) [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] result = getattr(controller, method)(*args, **kwargs) [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self._get(image_id) [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] resp, body = self.http_client.get(url, headers=header) [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self.request(url, 'GET', **kwargs) [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] 
return self._handle_response(resp) [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] raise exc.from_response(resp, resp.content) [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] During handling of the above exception, another exception occurred: [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Traceback (most recent call last): [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self._build_and_run_instance(context, instance, image, [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] with excutils.save_and_reraise_exception(): [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self.force_reraise() [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] raise self.value [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] with self.rt.instance_claim(context, instance, node, allocs, [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self.abort() [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self.tracker.abort_instance_claim(self.context, self.instance, [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File 
"/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1078.218217] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return f(*args, **kwargs) [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self._unset_instance_host_and_node(instance) [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] instance.save() [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] updates, result = self.indirection_api.object_action( [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return cctxt.call(context, 'object_action', objinst=objinst, [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] result = self.transport._send( [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self._driver.send(target, ctxt, message, [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] raise result [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] nova.exception_Remote.InstanceNotFound_Remote: Instance 4e66f1dc-18c6-4d64-9bbe-9b061e795a65 could not be found. 
[ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Traceback (most recent call last): [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return getattr(target, method)(*args, **kwargs) [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return fn(self, *args, **kwargs) [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] old_ref, inst_ref = db.instance_update_and_get_original( [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return f(*args, **kwargs) [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] with excutils.save_and_reraise_exception() as ectxt: [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self.force_reraise() [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] raise self.value [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return f(*args, 
**kwargs) [ 1078.219324] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return f(context, *args, **kwargs) [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] raise exception.InstanceNotFound(instance_id=uuid) [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] nova.exception.InstanceNotFound: Instance 4e66f1dc-18c6-4d64-9bbe-9b061e795a65 could not be found. [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] During handling of the above exception, another exception occurred: [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Traceback (most recent call last): [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] ret = obj(*args, **kwargs) [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] exception_handler_v20(status_code, error_body) [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] raise client_exc(message=error_message, [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1078.220507] 
env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Neutron server returns request_ids: ['req-2db36e91-b2e7-4429-869e-1a4d7022dfc2'] [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] During handling of the above exception, another exception occurred: [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Traceback (most recent call last): [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self._deallocate_network(context, instance, requested_networks) [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self.network_api.deallocate_for_instance( [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] data = neutron.list_ports(**search_opts) [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] ret = obj(*args, **kwargs) [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self.list('ports', self.ports_path, retrieve_all, [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] ret = obj(*args, **kwargs) [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1078.220507] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] for r in self._pagination(collection, path, **params): [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] res = self.get(path, params=params) [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] ret = obj(*args, **kwargs) [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self.retry_request("GET", action, body=body, [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] ret = obj(*args, **kwargs) [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] return self.do_request(method, action, body=body, [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] ret = obj(*args, **kwargs) [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] self._handle_fault_response(status_code, replybody, resp) [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] raise exception.Unauthorized() [ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] nova.exception.Unauthorized: Not authorized. 
[ 1078.221810] env[60722]: ERROR nova.compute.manager [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] [ 1078.241669] env[60722]: DEBUG oslo_concurrency.lockutils [None req-089ae6ee-1166-424e-8f63-7fdc38e1f71e tempest-ServersTestMultiNic-1186046946 tempest-ServersTestMultiNic-1186046946-project-member] Lock "4e66f1dc-18c6-4d64-9bbe-9b061e795a65" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 400.874s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1078.276175] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1078.277037] env[60722]: ERROR nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] Traceback (most recent call last): [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] result = getattr(controller, method)(*args, **kwargs) [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self._get(image_id) [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] resp, body = self.http_client.get(url, headers=header) [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self.request(url, 'GET', **kwargs) [ 1078.277037] 
env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self._handle_response(resp) [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] raise exc.from_response(resp, resp.content) [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] During handling of the above exception, another exception occurred: [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] Traceback (most recent call last): [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] yield resources [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self.driver.spawn(context, instance, image_meta, [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self._fetch_image_if_missing(context, vi) [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] image_fetch(context, vi, tmp_image_ds_loc) [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] images.fetch_image( [ 
1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1078.277037] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] metadata = IMAGE_API.get(context, image_ref) [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return session.show(context, image_id, [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] _reraise_translated_image_exception(image_id) [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] raise new_exc.with_traceback(exc_trace) [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] result = getattr(controller, method)(*args, **kwargs) [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self._get(image_id) [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] resp, body = self.http_client.get(url, headers=header) [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self.request(url, 'GET', **kwargs) [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1078.277983] env[60722]: ERROR 
nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self._handle_response(resp) [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] raise exc.from_response(resp, resp.content) [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. [ 1078.277983] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.277983] env[60722]: INFO nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Terminating instance [ 1078.279826] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1078.280115] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1078.280762] env[60722]: DEBUG nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Start destroying the instance on the hypervisor. 
{{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1078.281036] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1078.281301] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ce914769-ed9e-4512-9467-410699931b27 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.284484] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3e7ee76-b760-4097-b126-b414517e7bf8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.291406] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1078.291697] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0f41160f-08e6-4b41-80d1-5c14b492890d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.294543] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1078.294782] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1078.295775] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-729b5413-ecc5-42bd-8ffa-02a854e1f5e2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.301109] env[60722]: DEBUG oslo_vmware.api [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Waiting for the task: (returnval){ [ 1078.301109] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52cf82b0-f6be-4b77-c18f-8a18e86f2613" [ 1078.301109] env[60722]: _type = "Task" [ 1078.301109] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1078.308158] env[60722]: DEBUG oslo_vmware.api [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52cf82b0-f6be-4b77-c18f-8a18e86f2613, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1078.350848] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1078.351581] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1078.351852] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Deleting the datastore file [datastore1] 22463917-2185-42f7-87b7-2b720be45c22 {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1078.352150] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-754b2930-0c11-479f-a9fb-582882eeae4b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.358652] env[60722]: DEBUG oslo_vmware.api [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Waiting for the task: (returnval){ [ 1078.358652] env[60722]: value = "task-565209" [ 1078.358652] env[60722]: _type = "Task" [ 1078.358652] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1078.366146] env[60722]: DEBUG oslo_vmware.api [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Task: {'id': task-565209, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1078.387259] env[60722]: DEBUG nova.network.neutron [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Successfully updated port: 3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1078.396034] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "refresh_cache-2d58b057-fec8-4c3c-bf83-452d27abfd38" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1078.396210] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquired lock "refresh_cache-2d58b057-fec8-4c3c-bf83-452d27abfd38" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1078.396368] env[60722]: DEBUG nova.network.neutron [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1078.433074] env[60722]: DEBUG nova.network.neutron [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Instance cache missing network info. 
{{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1078.577241] env[60722]: DEBUG nova.network.neutron [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Updating instance_info_cache with network_info: [{"id": "3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8", "address": "fa:16:3e:60:71:ee", "network": {"id": "d8ffbf62-b735-4b9e-b07a-14cf3426b943", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-127230270-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "03406ae6612c4ceabe8c940d457db3fe", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e30245c5-78f5-48e6-b504-c6c21f5a9b45", "external-id": "nsx-vlan-transportzone-409", "segmentation_id": 409, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3ad98f70-e7", "ovs_interfaceid": "3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1078.589180] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Releasing lock "refresh_cache-2d58b057-fec8-4c3c-bf83-452d27abfd38" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1078.589461] env[60722]: DEBUG nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Instance network_info: |[{"id": "3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8", "address": "fa:16:3e:60:71:ee", "network": {"id": "d8ffbf62-b735-4b9e-b07a-14cf3426b943", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-127230270-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "03406ae6612c4ceabe8c940d457db3fe", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e30245c5-78f5-48e6-b504-c6c21f5a9b45", "external-id": "nsx-vlan-transportzone-409", "segmentation_id": 409, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3ad98f70-e7", "ovs_interfaceid": "3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 
1078.589887] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:60:71:ee', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e30245c5-78f5-48e6-b504-c6c21f5a9b45', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1078.597908] env[60722]: DEBUG oslo.service.loopingcall [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1078.598348] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1078.598555] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9eff2514-b4c1-4d86-a53f-93db19860af7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.618631] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1078.618631] env[60722]: value = "task-565210" [ 1078.618631] env[60722]: _type = "Task" [ 1078.618631] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1078.626336] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565210, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1078.810882] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1078.811150] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Creating directory with path [datastore1] vmware_temp/8cb7afb0-1285-4b22-a01e-3672272d741b/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1078.811376] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ded23d3a-4a2f-49b0-8f06-71a83827ebe2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.822141] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Created directory with path [datastore1] vmware_temp/8cb7afb0-1285-4b22-a01e-3672272d741b/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1078.822331] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Fetch image to [datastore1] vmware_temp/8cb7afb0-1285-4b22-a01e-3672272d741b/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1078.822599] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/8cb7afb0-1285-4b22-a01e-3672272d741b/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1078.823294] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd998712-5bb8-47c9-bb73-bd8073a11b1d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.829774] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5aef11b9-89b9-4884-a9c9-e315800a9121 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.839601] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61d42e58-5eb6-4a38-b230-c8d4ab0f1133 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.875277] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a760199-16a1-46ac-9a05-7118758f3c77 {{(pid=60722) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.882246] env[60722]: DEBUG oslo_vmware.api [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Task: {'id': task-565209, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.083609} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1078.883623] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1078.883802] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1078.883966] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1078.884189] env[60722]: INFO nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Took 0.60 seconds to destroy the instance on the hypervisor. 
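The DeleteDatastoreFile_Task records above follow oslo.vmware's invoke-then-poll pattern: the task is started, the API polls it (the "progress is 0%" records), and the caller unblocks once it completes. A minimal sketch of that pattern, assuming an already-authenticated oslo_vmware.api.VMwareAPISession in `session`; `datastore_path` and `datacenter_ref` are placeholders, not values taken from this log:

# Minimal sketch (not Nova's exact ds_util code) of invoking
# FileManager.DeleteDatastoreFile_Task and waiting on the result.
def delete_datastore_path(session, datastore_path, datacenter_ref):
    # The FileManager managed object lives on the service content.
    file_manager = session.vim.service_content.fileManager
    # Invoking the task returns immediately with a Task managed object.
    task = session.invoke_api(session.vim,
                              'DeleteDatastoreFile_Task',
                              file_manager,
                              name=datastore_path,      # e.g. "[datastore1] <instance-uuid>"
                              datacenter=datacenter_ref)
    # wait_for_task polls the task (the "progress is 0%" lines seen in
    # the log) and raises an oslo.vmware exception if it ends in error.
    session.wait_for_task(task)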
[ 1078.885835] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9a715595-3a47-421a-ac5b-9ead87572e5a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1078.887652] env[60722]: DEBUG nova.compute.claims [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1078.887820] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1078.888041] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1078.906604] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1078.912352] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1078.913024] env[60722]: DEBUG nova.compute.utils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Instance 22463917-2185-42f7-87b7-2b720be45c22 could not be found. {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1078.914988] env[60722]: DEBUG nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Instance disappeared during build. 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1078.915188] env[60722]: DEBUG nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1078.915316] env[60722]: DEBUG nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1078.915477] env[60722]: DEBUG nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1078.915628] env[60722]: DEBUG nova.network.neutron [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1078.940270] env[60722]: DEBUG neutronclient.v2_0.client [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60722) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1078.941817] env[60722]: ERROR nova.compute.manager [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
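The deallocation failure above comes from the wrapper shown in the traceback that follows: Nova's neutron API module wraps python-neutronclient calls and converts a 401 from Neutron into nova.exception.Unauthorized. A schematic sketch of that wrapping pattern (an illustration of the mechanism, not Nova's exact code):

# Schematic sketch of the re-raise pattern visible in the traceback below:
# a neutronclient Unauthorized (HTTP 401) becomes nova.exception.Unauthorized.
import functools

from neutronclient.common import exceptions as neutron_client_exc

from nova import exception


def translate_neutron_unauthorized(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except neutron_client_exc.Unauthorized:
            # The token used for this cleanup is no longer valid, so the
            # 401 from Neutron is surfaced as a Nova exception, which is
            # what aborts _deallocate_network here.
            raise exception.Unauthorized()
    return wrapper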
[ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] Traceback (most recent call last): [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] result = getattr(controller, method)(*args, **kwargs) [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self._get(image_id) [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] resp, body = self.http_client.get(url, headers=header) [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self.request(url, 'GET', **kwargs) [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self._handle_response(resp) [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] raise exc.from_response(resp, resp.content) [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] During handling of the above exception, another exception occurred: [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] Traceback (most recent call last): [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self.driver.spawn(context, instance, image_meta, [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self._fetch_image_if_missing(context, vi) [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] image_fetch(context, vi, tmp_image_ds_loc) [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] images.fetch_image( [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] metadata = IMAGE_API.get(context, image_ref) [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return session.show(context, image_id, [ 1078.941817] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] _reraise_translated_image_exception(image_id) [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] raise new_exc.with_traceback(exc_trace) [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 
22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] result = getattr(controller, method)(*args, **kwargs) [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self._get(image_id) [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] resp, body = self.http_client.get(url, headers=header) [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self.request(url, 'GET', **kwargs) [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self._handle_response(resp) [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] raise exc.from_response(resp, resp.content) [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. 
[ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] During handling of the above exception, another exception occurred: [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] Traceback (most recent call last): [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self._build_and_run_instance(context, instance, image, [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] with excutils.save_and_reraise_exception(): [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self.force_reraise() [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] raise self.value [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] with self.rt.instance_claim(context, instance, node, allocs, [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self.abort() [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self.tracker.abort_instance_claim(self.context, self.instance, [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1078.942902] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return f(*args, **kwargs) [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self._unset_instance_host_and_node(instance) [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 
22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] instance.save() [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] updates, result = self.indirection_api.object_action( [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return cctxt.call(context, 'object_action', objinst=objinst, [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] result = self.transport._send( [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self._driver.send(target, ctxt, message, [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] raise result [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] nova.exception_Remote.InstanceNotFound_Remote: Instance 22463917-2185-42f7-87b7-2b720be45c22 could not be found. 
[ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] Traceback (most recent call last): [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return getattr(target, method)(*args, **kwargs) [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return fn(self, *args, **kwargs) [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] old_ref, inst_ref = db.instance_update_and_get_original( [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return f(*args, **kwargs) [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] with excutils.save_and_reraise_exception() as ectxt: [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self.force_reraise() [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] raise self.value [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return f(*args, 
**kwargs) [ 1078.944225] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return f(context, *args, **kwargs) [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] raise exception.InstanceNotFound(instance_id=uuid) [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] nova.exception.InstanceNotFound: Instance 22463917-2185-42f7-87b7-2b720be45c22 could not be found. [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] During handling of the above exception, another exception occurred: [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] Traceback (most recent call last): [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] ret = obj(*args, **kwargs) [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] exception_handler_v20(status_code, error_body) [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] raise client_exc(message=error_message, [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1078.945537] 
env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] Neutron server returns request_ids: ['req-fdb6331f-5bec-42ec-8a26-25f3b7afa73e'] [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] During handling of the above exception, another exception occurred: [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] Traceback (most recent call last): [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self._deallocate_network(context, instance, requested_networks) [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self.network_api.deallocate_for_instance( [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] data = neutron.list_ports(**search_opts) [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] ret = obj(*args, **kwargs) [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self.list('ports', self.ports_path, retrieve_all, [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] ret = obj(*args, **kwargs) [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1078.945537] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] for r in self._pagination(collection, path, **params): [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] res = self.get(path, params=params) [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] ret = obj(*args, **kwargs) [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self.retry_request("GET", action, body=body, [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] ret = obj(*args, **kwargs) [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] return self.do_request(method, action, body=body, [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] ret = obj(*args, **kwargs) [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] self._handle_fault_response(status_code, replybody, resp) [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] raise exception.Unauthorized() [ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] nova.exception.Unauthorized: Not authorized. 
[ 1078.946658] env[60722]: ERROR nova.compute.manager [instance: 22463917-2185-42f7-87b7-2b720be45c22] [ 1078.963234] env[60722]: DEBUG oslo_concurrency.lockutils [None req-2a64d9f5-8fa7-410d-b92d-1185a1cf3e96 tempest-DeleteServersTestJSON-523829062 tempest-DeleteServersTestJSON-523829062-project-member] Lock "22463917-2185-42f7-87b7-2b720be45c22" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 397.552s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1079.001924] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1079.002674] env[60722]: ERROR nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Traceback (most recent call last): [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] result = getattr(controller, method)(*args, **kwargs) [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self._get(image_id) [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] resp, body = self.http_client.get(url, headers=header) [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self.request(url, 'GET', **kwargs) [ 
1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self._handle_response(resp) [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] raise exc.from_response(resp, resp.content) [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] During handling of the above exception, another exception occurred: [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Traceback (most recent call last): [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] yield resources [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self.driver.spawn(context, instance, image_meta, [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self._fetch_image_if_missing(context, vi) [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] image_fetch(context, vi, tmp_image_ds_loc) [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] 
images.fetch_image( [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1079.002674] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] metadata = IMAGE_API.get(context, image_ref) [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return session.show(context, image_id, [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] _reraise_translated_image_exception(image_id) [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] raise new_exc.with_traceback(exc_trace) [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] result = getattr(controller, method)(*args, **kwargs) [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self._get(image_id) [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] resp, body = self.http_client.get(url, headers=header) [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self.request(url, 'GET', **kwargs) [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1079.003758] 
env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self._handle_response(resp) [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] raise exc.from_response(resp, resp.content) [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. [ 1079.003758] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.003758] env[60722]: INFO nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Terminating instance [ 1079.004575] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1079.004655] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1079.005258] env[60722]: DEBUG nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Start destroying the instance on the hypervisor. 
{{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1079.005435] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1079.005646] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b6b28e1b-8b59-4646-83c3-06ad746eecf2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.008521] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3103d952-6d8b-4b32-b922-4d02a87c8176 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.015356] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1079.015550] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e2356afe-2144-4574-a0a0-6e0d0db511d0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.017646] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1079.017808] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1079.018732] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7d2a4a2b-58b6-4adb-99ee-9fb060ace8f7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.023428] env[60722]: DEBUG oslo_vmware.api [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Waiting for the task: (returnval){ [ 1079.023428] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]5232c70b-5ac4-1ef2-8ec0-75294763df69" [ 1079.023428] env[60722]: _type = "Task" [ 1079.023428] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1079.030115] env[60722]: DEBUG oslo_vmware.api [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]5232c70b-5ac4-1ef2-8ec0-75294763df69, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1079.063505] env[60722]: DEBUG nova.compute.manager [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Received event network-vif-plugged-3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1079.063723] env[60722]: DEBUG oslo_concurrency.lockutils [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] Acquiring lock "2d58b057-fec8-4c3c-bf83-452d27abfd38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1079.063986] env[60722]: DEBUG oslo_concurrency.lockutils [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] Lock "2d58b057-fec8-4c3c-bf83-452d27abfd38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1079.064150] env[60722]: DEBUG oslo_concurrency.lockutils [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] Lock "2d58b057-fec8-4c3c-bf83-452d27abfd38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1079.064307] env[60722]: DEBUG nova.compute.manager [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] No waiting events found dispatching network-vif-plugged-3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1079.064461] env[60722]: WARNING nova.compute.manager [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Received unexpected event network-vif-plugged-3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8 for instance with vm_state building and task_state spawning. [ 1079.064612] env[60722]: DEBUG nova.compute.manager [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Received event network-changed-3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1079.064755] env[60722]: DEBUG nova.compute.manager [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Refreshing instance network info cache due to event network-changed-3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1079.064927] env[60722]: DEBUG oslo_concurrency.lockutils [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] Acquiring lock "refresh_cache-2d58b057-fec8-4c3c-bf83-452d27abfd38" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1079.065072] env[60722]: DEBUG oslo_concurrency.lockutils [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] Acquired lock "refresh_cache-2d58b057-fec8-4c3c-bf83-452d27abfd38" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1079.065222] env[60722]: DEBUG nova.network.neutron [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Refreshing network info cache for port 3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1079.079259] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1079.079450] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1079.079622] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Deleting the datastore file [datastore1] 1c4b8597-88ec-4e79-a749-f802803a5ffe {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1079.079842] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c40058f0-394c-4fbe-93e7-397502a68477 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.086111] env[60722]: DEBUG oslo_vmware.api [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Waiting for the task: (returnval){ [ 1079.086111] env[60722]: value = "task-565212" [ 1079.086111] env[60722]: _type = "Task" [ 1079.086111] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1079.093309] env[60722]: DEBUG oslo_vmware.api [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Task: {'id': task-565212, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1079.126840] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565210, 'name': CreateVM_Task, 'duration_secs': 0.279945} completed successfully. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1079.126984] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1079.127590] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1079.127742] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1079.128061] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1079.128285] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-42bad1da-7a68-4e1b-a93b-4f727147c9a4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.132150] env[60722]: DEBUG oslo_vmware.api [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Waiting for the task: (returnval){ [ 1079.132150] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]521752cc-bac5-bd43-7bbc-526318e0e9d1" [ 1079.132150] env[60722]: _type = "Task" [ 1079.132150] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1079.139547] env[60722]: DEBUG oslo_vmware.api [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]521752cc-bac5-bd43-7bbc-526318e0e9d1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1079.286141] env[60722]: DEBUG nova.network.neutron [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Updated VIF entry in instance network info cache for port 3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8. 
{{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1079.286481] env[60722]: DEBUG nova.network.neutron [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Updating instance_info_cache with network_info: [{"id": "3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8", "address": "fa:16:3e:60:71:ee", "network": {"id": "d8ffbf62-b735-4b9e-b07a-14cf3426b943", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-127230270-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "03406ae6612c4ceabe8c940d457db3fe", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e30245c5-78f5-48e6-b504-c6c21f5a9b45", "external-id": "nsx-vlan-transportzone-409", "segmentation_id": 409, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3ad98f70-e7", "ovs_interfaceid": "3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1079.295018] env[60722]: DEBUG oslo_concurrency.lockutils [req-96862c6e-a89c-49a5-af8f-63d9a2470d09 req-fea076d1-dad7-46c4-8624-554819fdab79 service nova] Releasing lock "refresh_cache-2d58b057-fec8-4c3c-bf83-452d27abfd38" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1079.534467] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1079.534701] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Creating directory with path [datastore1] vmware_temp/4b7fe13f-3830-445f-8d0a-96ed083f6cc8/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1079.534910] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b92634ae-dde7-45a4-875f-5ff2c4f561d6 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.546774] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Created directory with path [datastore1] vmware_temp/4b7fe13f-3830-445f-8d0a-96ed083f6cc8/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1079.546920] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 
2786801d-6211-4598-b357-4f0a0ffdd7d1] Fetch image to [datastore1] vmware_temp/4b7fe13f-3830-445f-8d0a-96ed083f6cc8/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1079.547083] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/4b7fe13f-3830-445f-8d0a-96ed083f6cc8/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1079.547861] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fedcd8c-58d5-49fa-b02e-01e2a507ec0e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.554378] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-052b8ea2-4122-426c-942b-4184b388935e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.562909] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed34c02e-64dc-4075-983e-d36553ff5dd5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.594635] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ee8d137-a139-4f91-8929-e80dd3a36c9a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.602477] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-03b76dbd-bd3c-46c0-8982-dd6c93c5c7e9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.604059] env[60722]: DEBUG oslo_vmware.api [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Task: {'id': task-565212, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063132} completed successfully. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1079.604281] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1079.604449] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1079.604610] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1079.604774] env[60722]: INFO nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Took 0.60 seconds to destroy the instance on the hypervisor. [ 1079.606755] env[60722]: DEBUG nova.compute.claims [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1079.606957] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1079.607188] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1079.626715] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1079.631360] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1079.631967] env[60722]: DEBUG nova.compute.utils [None 
req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Instance 1c4b8597-88ec-4e79-a749-f802803a5ffe could not be found. {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1079.636454] env[60722]: DEBUG nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Instance disappeared during build. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1079.636615] env[60722]: DEBUG nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1079.636771] env[60722]: DEBUG nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1079.636935] env[60722]: DEBUG nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1079.637110] env[60722]: DEBUG nova.network.neutron [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1079.643514] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1079.643731] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1079.643942] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1079.661600] env[60722]: DEBUG neutronclient.v2_0.client [None 
req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60722) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1079.663114] env[60722]: ERROR nova.compute.manager [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Traceback (most recent call last): [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] result = getattr(controller, method)(*args, **kwargs) [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self._get(image_id) [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] resp, body = self.http_client.get(url, headers=header) [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self.request(url, 'GET', **kwargs) [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self._handle_response(resp) [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] raise exc.from_response(resp, 
resp.content) [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] During handling of the above exception, another exception occurred: [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Traceback (most recent call last): [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self.driver.spawn(context, instance, image_meta, [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self._fetch_image_if_missing(context, vi) [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] image_fetch(context, vi, tmp_image_ds_loc) [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] images.fetch_image( [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] metadata = IMAGE_API.get(context, image_ref) [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return session.show(context, image_id, [ 1079.663114] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] _reraise_translated_image_exception(image_id) [ 1079.664211] 
env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] raise new_exc.with_traceback(exc_trace) [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] result = getattr(controller, method)(*args, **kwargs) [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self._get(image_id) [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] resp, body = self.http_client.get(url, headers=header) [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self.request(url, 'GET', **kwargs) [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self._handle_response(resp) [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] raise exc.from_response(resp, resp.content) [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. 
[ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] During handling of the above exception, another exception occurred: [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Traceback (most recent call last): [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self._build_and_run_instance(context, instance, image, [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] with excutils.save_and_reraise_exception(): [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self.force_reraise() [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] raise self.value [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] with self.rt.instance_claim(context, instance, node, allocs, [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self.abort() [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self.tracker.abort_instance_claim(self.context, self.instance, [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1079.664211] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return f(*args, **kwargs) [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self._unset_instance_host_and_node(instance) [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 
1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] instance.save() [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] updates, result = self.indirection_api.object_action( [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return cctxt.call(context, 'object_action', objinst=objinst, [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] result = self.transport._send( [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self._driver.send(target, ctxt, message, [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] raise result [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] nova.exception_Remote.InstanceNotFound_Remote: Instance 1c4b8597-88ec-4e79-a749-f802803a5ffe could not be found. 
[ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Traceback (most recent call last): [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return getattr(target, method)(*args, **kwargs) [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return fn(self, *args, **kwargs) [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] old_ref, inst_ref = db.instance_update_and_get_original( [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return f(*args, **kwargs) [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] with excutils.save_and_reraise_exception() as ectxt: [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self.force_reraise() [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] raise self.value [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return f(*args, 
**kwargs) [ 1079.665205] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return f(context, *args, **kwargs) [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] raise exception.InstanceNotFound(instance_id=uuid) [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] nova.exception.InstanceNotFound: Instance 1c4b8597-88ec-4e79-a749-f802803a5ffe could not be found. [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] During handling of the above exception, another exception occurred: [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Traceback (most recent call last): [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] ret = obj(*args, **kwargs) [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] exception_handler_v20(status_code, error_body) [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] raise client_exc(message=error_message, [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1079.666252] 
env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Neutron server returns request_ids: ['req-a72ded46-d378-4fe2-9b27-240f1e5af48a'] [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] During handling of the above exception, another exception occurred: [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Traceback (most recent call last): [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self._deallocate_network(context, instance, requested_networks) [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self.network_api.deallocate_for_instance( [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] data = neutron.list_ports(**search_opts) [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] ret = obj(*args, **kwargs) [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self.list('ports', self.ports_path, retrieve_all, [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] ret = obj(*args, **kwargs) [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1079.666252] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] for r in self._pagination(collection, path, **params): [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] res = self.get(path, params=params) [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] ret = obj(*args, **kwargs) [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self.retry_request("GET", action, body=body, [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] ret = obj(*args, **kwargs) [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] return self.do_request(method, action, body=body, [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] ret = obj(*args, **kwargs) [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] self._handle_fault_response(status_code, replybody, resp) [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] raise exception.Unauthorized() [ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] nova.exception.Unauthorized: Not authorized. 
[ 1079.667648] env[60722]: ERROR nova.compute.manager [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] [ 1079.686603] env[60722]: DEBUG oslo_concurrency.lockutils [None req-beffa758-1e79-4bcd-97d9-decc4a2d3f90 tempest-AttachInterfacesV270Test-1608388824 tempest-AttachInterfacesV270Test-1608388824-project-member] Lock "1c4b8597-88ec-4e79-a749-f802803a5ffe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 398.022s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1079.718467] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1079.719191] env[60722]: ERROR nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Traceback (most recent call last): [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] result = getattr(controller, method)(*args, **kwargs) [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self._get(image_id) [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] resp, body = self.http_client.get(url, headers=header) [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self.request(url, 'GET', **kwargs) [ 1079.719191] 
env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self._handle_response(resp) [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] raise exc.from_response(resp, resp.content) [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] During handling of the above exception, another exception occurred: [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Traceback (most recent call last): [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] yield resources [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self.driver.spawn(context, instance, image_meta, [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self._fetch_image_if_missing(context, vi) [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] image_fetch(context, vi, tmp_image_ds_loc) [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] images.fetch_image( [ 
1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1079.719191] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] metadata = IMAGE_API.get(context, image_ref) [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return session.show(context, image_id, [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] _reraise_translated_image_exception(image_id) [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] raise new_exc.with_traceback(exc_trace) [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] result = getattr(controller, method)(*args, **kwargs) [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self._get(image_id) [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] resp, body = self.http_client.get(url, headers=header) [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self.request(url, 'GET', **kwargs) [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1079.720372] env[60722]: ERROR 
nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self._handle_response(resp) [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] raise exc.from_response(resp, resp.content) [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. [ 1079.720372] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1079.720372] env[60722]: INFO nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Terminating instance [ 1079.721099] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1079.721139] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1079.721729] env[60722]: DEBUG nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Start destroying the instance on the hypervisor. 
{{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1079.721913] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1079.722134] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3d3f9e9b-2935-4ef3-8379-b97b50a50021 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.724756] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afae57f0-72cb-4f94-a0b5-a388cb353f29 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.731505] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1079.731697] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cf909807-49f2-4bfe-8a7a-026f68d22ec1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.733778] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1079.733943] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1079.734831] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2c6eda74-c9e3-4e90-8c44-fd810e22e558 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.739214] env[60722]: DEBUG oslo_vmware.api [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Waiting for the task: (returnval){ [ 1079.739214] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]521a3c82-2c4b-3171-6a9b-54e988b9afe3" [ 1079.739214] env[60722]: _type = "Task" [ 1079.739214] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1079.748228] env[60722]: DEBUG oslo_vmware.api [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]521a3c82-2c4b-3171-6a9b-54e988b9afe3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1079.800136] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1079.800286] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1079.800448] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Deleting the datastore file [datastore1] 2786801d-6211-4598-b357-4f0a0ffdd7d1 {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1079.800750] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d2caed69-c7c9-4801-ae0d-0514640145ad {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1079.806655] env[60722]: DEBUG oslo_vmware.api [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Waiting for the task: (returnval){ [ 1079.806655] env[60722]: value = "task-565214" [ 1079.806655] env[60722]: _type = "Task" [ 1079.806655] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1079.814268] env[60722]: DEBUG oslo_vmware.api [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Task: {'id': task-565214, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1079.944505] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1079.944709] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1079.944863] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1079.945020] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60722) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1080.249153] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1080.249350] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Creating directory with path [datastore1] vmware_temp/22eeb202-a128-429a-8008-d923a872b227/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1080.249579] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8c8d6a87-24ae-41e7-83ac-e07578d4f414 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.260498] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Created directory with path [datastore1] vmware_temp/22eeb202-a128-429a-8008-d923a872b227/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1080.260684] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Fetch image to [datastore1] vmware_temp/22eeb202-a128-429a-8008-d923a872b227/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1080.260861] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/22eeb202-a128-429a-8008-d923a872b227/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1080.261571] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed7fc43a-4076-4988-896f-6fad12dc2e52 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.267851] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16b1b2c9-128f-4913-9001-0713a737263a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.276396] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd5e6bc0-cba6-410e-92d4-eafc0e06b3cf {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.306357] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aedc16ef-faee-499a-a84b-c4d7207cc5cc {{(pid=60722) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.317067] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-51bec090-4518-46bf-94ae-251712c9e051 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.318639] env[60722]: DEBUG oslo_vmware.api [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Task: {'id': task-565214, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064508} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1080.318859] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1080.319040] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1080.319206] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1080.319367] env[60722]: INFO nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1080.321773] env[60722]: DEBUG nova.compute.claims [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1080.321938] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1080.322192] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1080.337239] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1080.368893] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.046s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1080.368893] env[60722]: DEBUG nova.compute.utils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Instance 2786801d-6211-4598-b357-4f0a0ffdd7d1 could not be found. {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1080.370157] env[60722]: DEBUG nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Instance disappeared during build. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1080.370320] env[60722]: DEBUG nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1080.371372] env[60722]: DEBUG nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1080.371372] env[60722]: DEBUG nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1080.371372] env[60722]: DEBUG nova.network.neutron [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1080.431486] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1080.432253] env[60722]: ERROR nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Traceback (most recent call last): [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] result = getattr(controller, method)(*args, **kwargs) [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self._get(image_id) [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] resp, body = self.http_client.get(url, headers=header) [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self.request(url, 'GET', **kwargs) [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self._handle_response(resp) [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] raise exc.from_response(resp, resp.content) [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] During handling of the above exception, another exception occurred: [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Traceback (most recent call last): [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] yield resources [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self.driver.spawn(context, instance, image_meta, [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self._fetch_image_if_missing(context, vi) [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] image_fetch(context, vi, tmp_image_ds_loc) [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 
019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] images.fetch_image( [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1080.432253] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] metadata = IMAGE_API.get(context, image_ref) [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return session.show(context, image_id, [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] _reraise_translated_image_exception(image_id) [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] raise new_exc.with_traceback(exc_trace) [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] result = getattr(controller, method)(*args, **kwargs) [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self._get(image_id) [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] resp, body = self.http_client.get(url, headers=header) [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self.request(url, 'GET', 
**kwargs) [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self._handle_response(resp) [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] raise exc.from_response(resp, resp.content) [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. [ 1080.433362] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1080.433362] env[60722]: INFO nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Terminating instance [ 1080.434093] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1080.434248] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1080.434842] env[60722]: DEBUG nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Start destroying the instance on the hypervisor. 
{{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1080.435044] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1080.435265] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6e98b186-11ff-4058-b0e0-121797eef4c6 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.437901] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a2f0703-2116-4404-b73f-29e9951f8c75 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.445953] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1080.446131] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6c24db0f-419e-4246-895b-8afd2f07d217 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.448263] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1080.448427] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1080.449337] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b15145a1-79d7-4e91-93e3-c23d386af688 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.454950] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Waiting for the task: (returnval){ [ 1080.454950] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52b3dba6-abb5-9069-c120-d98c5c535ef9" [ 1080.454950] env[60722]: _type = "Task" [ 1080.454950] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1080.462036] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52b3dba6-abb5-9069-c120-d98c5c535ef9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1080.498533] env[60722]: DEBUG neutronclient.v2_0.client [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60722) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1080.500075] env[60722]: ERROR nova.compute.manager [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Traceback (most recent call last): [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] result = getattr(controller, method)(*args, **kwargs) [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self._get(image_id) [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] resp, body = self.http_client.get(url, headers=header) [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self.request(url, 'GET', **kwargs) [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self._handle_response(resp) [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 
1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] raise exc.from_response(resp, resp.content) [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] During handling of the above exception, another exception occurred: [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Traceback (most recent call last): [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self.driver.spawn(context, instance, image_meta, [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self._fetch_image_if_missing(context, vi) [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] image_fetch(context, vi, tmp_image_ds_loc) [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] images.fetch_image( [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] metadata = IMAGE_API.get(context, image_ref) [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return session.show(context, image_id, [ 1080.500075] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1080.501599] env[60722]: ERROR 
nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] _reraise_translated_image_exception(image_id) [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] raise new_exc.with_traceback(exc_trace) [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] result = getattr(controller, method)(*args, **kwargs) [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self._get(image_id) [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] resp, body = self.http_client.get(url, headers=header) [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self.request(url, 'GET', **kwargs) [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self._handle_response(resp) [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] raise exc.from_response(resp, resp.content) [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. 
[ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] During handling of the above exception, another exception occurred: [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Traceback (most recent call last): [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self._build_and_run_instance(context, instance, image, [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] with excutils.save_and_reraise_exception(): [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self.force_reraise() [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] raise self.value [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] with self.rt.instance_claim(context, instance, node, allocs, [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self.abort() [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self.tracker.abort_instance_claim(self.context, self.instance, [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1080.501599] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return f(*args, **kwargs) [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self._unset_instance_host_and_node(instance) [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 
2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] instance.save() [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] updates, result = self.indirection_api.object_action( [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return cctxt.call(context, 'object_action', objinst=objinst, [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] result = self.transport._send( [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self._driver.send(target, ctxt, message, [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] raise result [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] nova.exception_Remote.InstanceNotFound_Remote: Instance 2786801d-6211-4598-b357-4f0a0ffdd7d1 could not be found. 
[ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Traceback (most recent call last): [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return getattr(target, method)(*args, **kwargs) [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return fn(self, *args, **kwargs) [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] old_ref, inst_ref = db.instance_update_and_get_original( [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return f(*args, **kwargs) [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] with excutils.save_and_reraise_exception() as ectxt: [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self.force_reraise() [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] raise self.value [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return f(*args, 
**kwargs) [ 1080.503395] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return f(context, *args, **kwargs) [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] raise exception.InstanceNotFound(instance_id=uuid) [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] nova.exception.InstanceNotFound: Instance 2786801d-6211-4598-b357-4f0a0ffdd7d1 could not be found. [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] During handling of the above exception, another exception occurred: [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Traceback (most recent call last): [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] ret = obj(*args, **kwargs) [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] exception_handler_v20(status_code, error_body) [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] raise client_exc(message=error_message, [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1080.505601] 
env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Neutron server returns request_ids: ['req-0838c3fc-1223-4766-8337-f5596b2c761e'] [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] During handling of the above exception, another exception occurred: [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Traceback (most recent call last): [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self._deallocate_network(context, instance, requested_networks) [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self.network_api.deallocate_for_instance( [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] data = neutron.list_ports(**search_opts) [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] ret = obj(*args, **kwargs) [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self.list('ports', self.ports_path, retrieve_all, [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] ret = obj(*args, **kwargs) [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1080.505601] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] for r in self._pagination(collection, path, **params): [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] res = self.get(path, params=params) [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] ret = obj(*args, **kwargs) [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self.retry_request("GET", action, body=body, [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] ret = obj(*args, **kwargs) [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] return self.do_request(method, action, body=body, [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] ret = obj(*args, **kwargs) [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] self._handle_fault_response(status_code, replybody, resp) [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] raise exception.Unauthorized() [ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] nova.exception.Unauthorized: Not authorized. 
[ 1080.507081] env[60722]: ERROR nova.compute.manager [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] [ 1080.514108] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1080.514315] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1080.514483] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Deleting the datastore file [datastore1] 019db29d-b8e4-4592-b7c4-2c044e2b2a51 {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1080.514715] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4b7057e2-6a80-4a47-9ac3-8cfb084f2efc {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.520420] env[60722]: DEBUG oslo_vmware.api [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Waiting for the task: (returnval){ [ 1080.520420] env[60722]: value = "task-565216" [ 1080.520420] env[60722]: _type = "Task" [ 1080.520420] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1080.521304] env[60722]: DEBUG oslo_concurrency.lockutils [None req-6daaddb2-a22c-43bd-a4d1-976fd75f12a1 tempest-ServerGroupTestJSON-446669058 tempest-ServerGroupTestJSON-446669058-project-member] Lock "2786801d-6211-4598-b357-4f0a0ffdd7d1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 395.491s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1080.529804] env[60722]: DEBUG oslo_vmware.api [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Task: {'id': task-565216, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1080.965499] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1080.965780] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Creating directory with path [datastore1] vmware_temp/cf79c76f-8a8f-490d-8faa-ec094a0381c0/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1080.965937] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b7c3c4bb-0050-4ada-a816-264ec9188174 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.976700] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Created directory with path [datastore1] vmware_temp/cf79c76f-8a8f-490d-8faa-ec094a0381c0/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1080.976899] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Fetch image to [datastore1] vmware_temp/cf79c76f-8a8f-490d-8faa-ec094a0381c0/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1080.977076] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/cf79c76f-8a8f-490d-8faa-ec094a0381c0/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1080.977760] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8842a627-e4a2-4062-aecf-6d81def64f5f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.984246] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c496332-426e-4982-8aee-d800a300ef63 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1080.992781] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcb6e174-d013-4c48-a5ca-ee9f475be3ee {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1081.021990] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79d7fceb-abf1-4644-a5c5-270f40790f7d {{(pid=60722) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1081.032731] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-91a38599-4ad1-4b65-9019-f879e3f2e12c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1081.034377] env[60722]: DEBUG oslo_vmware.api [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Task: {'id': task-565216, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070316} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1081.034588] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1081.034755] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1081.034919] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1081.035099] env[60722]: INFO nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Took 0.60 seconds to destroy the instance on the hypervisor. 
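The DeleteDatastoreFile_Task sequence above (Invoking FileManager.DeleteDatastoreFile_Task, "Waiting for the task ... to complete", the "progress is 0%" poll, then "completed successfully" and "Deleted the datastore file") is the usual oslo.vmware invoke-then-wait pattern that Nova's ds_util.file_delete drives. A minimal standalone sketch of that pattern follows; the vCenter address, credentials, poll interval and datacenter moref are illustrative placeholders, not values taken from this log.

    # Sketch only: issue a vSphere DeleteDatastoreFile_Task through oslo.vmware
    # and block on it, mirroring the invoke / wait_for_task / _poll_task records above.
    from oslo_vmware import api, vim_util

    session = api.VMwareAPISession(
        'vc.example.test',                 # placeholder vCenter, not the host in this log
        'administrator@vsphere.local',     # placeholder credentials
        'secret',
        api_retry_count=10,
        task_poll_interval=0.5)            # interval used by wait_for_task's polling loop

    file_manager = session.vim.service_content.fileManager
    datacenter = vim_util.get_moref('datacenter-2', 'Datacenter')  # placeholder moref

    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] 019db29d-b8e4-4592-b7c4-2c044e2b2a51',  # datastore path, as in the log
        datacenter=datacenter)

    # Polls the task (the "progress is 0%" lines) until it succeeds, raising on error.
    session.wait_for_task(task)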
[ 1081.037177] env[60722]: DEBUG nova.compute.claims [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1081.037338] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1081.037536] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1081.054528] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1081.061506] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1081.062139] env[60722]: DEBUG nova.compute.utils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Instance 019db29d-b8e4-4592-b7c4-2c044e2b2a51 could not be found. {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1081.063523] env[60722]: DEBUG nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Instance disappeared during build. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1081.063680] env[60722]: DEBUG nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1081.063841] env[60722]: DEBUG nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1081.064018] env[60722]: DEBUG nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1081.064180] env[60722]: DEBUG nova.network.neutron [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1081.097985] env[60722]: DEBUG oslo_vmware.rw_handles [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cf79c76f-8a8f-490d-8faa-ec094a0381c0/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1081.151665] env[60722]: DEBUG oslo_vmware.rw_handles [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Completed reading data from the image iterator. {{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1081.151823] env[60722]: DEBUG oslo_vmware.rw_handles [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cf79c76f-8a8f-490d-8faa-ec094a0381c0/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1081.196513] env[60722]: DEBUG neutronclient.v2_0.client [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60722) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1081.198032] env[60722]: ERROR nova.compute.manager [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Traceback (most recent call last): [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] result = getattr(controller, method)(*args, **kwargs) [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self._get(image_id) [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] resp, body = self.http_client.get(url, headers=header) [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self.request(url, 'GET', **kwargs) [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self._handle_response(resp) [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] raise exc.from_response(resp, resp.content) [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] During handling of the above exception, another exception occurred: [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Traceback (most recent call last): [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self.driver.spawn(context, instance, image_meta, [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self._fetch_image_if_missing(context, vi) [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] image_fetch(context, vi, tmp_image_ds_loc) [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] images.fetch_image( [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] metadata = IMAGE_API.get(context, image_ref) [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return session.show(context, image_id, [ 1081.198032] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] _reraise_translated_image_exception(image_id) [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] raise new_exc.with_traceback(exc_trace) [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 
019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] result = getattr(controller, method)(*args, **kwargs) [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self._get(image_id) [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] resp, body = self.http_client.get(url, headers=header) [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self.request(url, 'GET', **kwargs) [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self._handle_response(resp) [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] raise exc.from_response(resp, resp.content) [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. 
[ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] During handling of the above exception, another exception occurred: [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Traceback (most recent call last): [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self._build_and_run_instance(context, instance, image, [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] with excutils.save_and_reraise_exception(): [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self.force_reraise() [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] raise self.value [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] with self.rt.instance_claim(context, instance, node, allocs, [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self.abort() [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self.tracker.abort_instance_claim(self.context, self.instance, [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1081.199540] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return f(*args, **kwargs) [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self._unset_instance_host_and_node(instance) [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 
019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] instance.save() [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] updates, result = self.indirection_api.object_action( [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return cctxt.call(context, 'object_action', objinst=objinst, [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] result = self.transport._send( [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self._driver.send(target, ctxt, message, [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] raise result [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] nova.exception_Remote.InstanceNotFound_Remote: Instance 019db29d-b8e4-4592-b7c4-2c044e2b2a51 could not be found. 
[ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Traceback (most recent call last): [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return getattr(target, method)(*args, **kwargs) [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return fn(self, *args, **kwargs) [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] old_ref, inst_ref = db.instance_update_and_get_original( [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return f(*args, **kwargs) [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] with excutils.save_and_reraise_exception() as ectxt: [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self.force_reraise() [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] raise self.value [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return f(*args, 
**kwargs) [ 1081.200632] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return f(context, *args, **kwargs) [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] raise exception.InstanceNotFound(instance_id=uuid) [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] nova.exception.InstanceNotFound: Instance 019db29d-b8e4-4592-b7c4-2c044e2b2a51 could not be found. [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] During handling of the above exception, another exception occurred: [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Traceback (most recent call last): [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] ret = obj(*args, **kwargs) [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] exception_handler_v20(status_code, error_body) [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] raise client_exc(message=error_message, [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1081.201802] 
env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Neutron server returns request_ids: ['req-2c611bd9-1780-49f9-a7b8-fe34c408acfc'] [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] During handling of the above exception, another exception occurred: [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Traceback (most recent call last): [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self._deallocate_network(context, instance, requested_networks) [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self.network_api.deallocate_for_instance( [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] data = neutron.list_ports(**search_opts) [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] ret = obj(*args, **kwargs) [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self.list('ports', self.ports_path, retrieve_all, [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] ret = obj(*args, **kwargs) [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1081.201802] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] for r in self._pagination(collection, path, **params): [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] res = self.get(path, params=params) [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] ret = obj(*args, **kwargs) [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self.retry_request("GET", action, body=body, [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] ret = obj(*args, **kwargs) [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] return self.do_request(method, action, body=body, [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] ret = obj(*args, **kwargs) [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] self._handle_fault_response(status_code, replybody, resp) [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] raise exception.Unauthorized() [ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] nova.exception.Unauthorized: Not authorized. 
[ 1081.203400] env[60722]: ERROR nova.compute.manager [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] [ 1081.221646] env[60722]: DEBUG oslo_concurrency.lockutils [None req-3f66d66f-fa50-49ac-a6f5-8f9f16b20dd5 tempest-AttachVolumeShelveTestJSON-651595327 tempest-AttachVolumeShelveTestJSON-651595327-project-member] Lock "019db29d-b8e4-4592-b7c4-2c044e2b2a51" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 395.252s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1081.940513] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1082.943782] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1082.944187] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1082.944187] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1083.944858] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1083.954822] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1083.955024] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1083.955198] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1083.955348] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1083.956361] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ab213e1-08d5-40b6-808b-29c3ad76c1d7 {{(pid=60722) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1083.964855] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35afc0a4-97e5-4ab1-a9f3-4813593f12e0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1083.978116] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fab3d028-26ec-4af9-9ffb-35fa3c1e8e3e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1083.983971] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b82dab2e-ed80-406e-a0e7-1f52538d18e0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1084.011675] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181720MB free_disk=100GB free_vcpus=48 pci_devices=None {{(pid=60722) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1084.011808] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1084.011986] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1084.051947] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 8d78f310-a2f2-4073-8371-afc42cc566f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1084.052112] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance 2d58b057-fec8-4c3c-bf83-452d27abfd38 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1084.052284] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1084.052424] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1084.087361] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2c9eb9b-719e-4d0c-8dd4-aec00f87fd64 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1084.094385] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1c479a2-a1d4-48ff-bac2-40be87dc15a8 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1084.124399] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f133bf47-0fbb-4868-8b9a-718de7d6f67f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1084.130802] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-937ad4e9-1365-4504-9a3a-fc4ebd6bfe62 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1084.143130] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1084.150925] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1084.163060] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1084.163240] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1086.164096] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] 
Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1086.164487] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Starting heal instance info cache {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1086.164487] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Rebuilding the list of instances to heal {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1086.175670] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1086.175818] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1086.175944] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Didn't find any instances for network info cache update. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1102.545604] env[60722]: DEBUG nova.compute.manager [req-d68827fb-cdae-4d4c-b0e2-8be6753b3ac4 req-f23ca955-40d0-40ef-9c92-9e3e9d0023be service nova] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Received event network-vif-deleted-2d83902a-f312-4c9b-8f37-3857c1c8e091 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1127.540754] env[60722]: DEBUG nova.compute.manager [req-2393d2ac-2db3-46ea-8bac-b71a711b21a8 req-b9302b0b-f0cc-49c9-83e3-181196b0c77d service nova] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Received event network-vif-deleted-3ad98f70-e77d-4b23-b3a1-dd3a01a74cc8 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1131.122986] env[60722]: WARNING oslo_vmware.rw_handles [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1131.122986] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1131.122986] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1131.122986] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1131.122986] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1131.122986] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 1131.122986] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1131.122986] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1131.122986] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1131.122986] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection 
without" [ 1131.122986] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1131.122986] env[60722]: ERROR oslo_vmware.rw_handles [ 1131.123626] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/cf79c76f-8a8f-490d-8faa-ec094a0381c0/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1131.124998] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Caching image {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1131.125266] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Copying Virtual Disk [datastore1] vmware_temp/cf79c76f-8a8f-490d-8faa-ec094a0381c0/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/cf79c76f-8a8f-490d-8faa-ec094a0381c0/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1131.125573] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a778c688-9b12-4399-81b6-110fc717aa88 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.132883] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Waiting for the task: (returnval){ [ 1131.132883] env[60722]: value = "task-565217" [ 1131.132883] env[60722]: _type = "Task" [ 1131.132883] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1131.141485] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Task: {'id': task-565217, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1131.644324] env[60722]: DEBUG oslo_vmware.exceptions [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Fault InvalidArgument not matched. 
{{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1131.644441] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1131.644876] env[60722]: ERROR nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1131.644876] env[60722]: Faults: ['InvalidArgument'] [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] Traceback (most recent call last): [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] yield resources [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self.driver.spawn(context, instance, image_meta, [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self._fetch_image_if_missing(context, vi) [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] image_cache(vi, tmp_image_ds_loc) [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] vm_util.copy_virtual_disk( [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] session._wait_for_task(vmdk_copy_task) [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return self.wait_for_task(task_ref) [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return evt.wait() [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] result = hub.switch() [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return self.greenlet.switch() [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self.f(*self.args, **self.kw) [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] raise exceptions.translate_fault(task_info.error) [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] Faults: ['InvalidArgument'] [ 1131.644876] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1131.646266] env[60722]: INFO nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Terminating instance [ 1131.646638] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1131.646838] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1131.647426] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Start destroying the instance on the 
hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1131.647608] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1131.647811] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1891367b-8d7d-4114-b013-8daa2a62581c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.650083] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0cb2d347-c6d2-4da0-a507-9fb50cf765a1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.657323] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1131.657674] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1a1163a3-46af-4bf5-87f3-eb6250df38e1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.662089] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1131.662089] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1131.662089] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e41a99d8-ac73-4ccf-8f30-91885231e855 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.665705] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Waiting for the task: (returnval){ [ 1131.665705] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]528cfeba-a9ee-2605-aa90-9b3f4de690de" [ 1131.665705] env[60722]: _type = "Task" [ 1131.665705] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1131.673278] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]528cfeba-a9ee-2605-aa90-9b3f4de690de, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1131.728997] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1131.729244] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1131.729417] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Deleting the datastore file [datastore1] b9025e22-8080-4887-8e4e-179866f704ca {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1131.729676] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-14c27e06-321d-4ab0-8543-e5a4c3a18c6b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1131.736704] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Waiting for the task: (returnval){ [ 1131.736704] env[60722]: value = "task-565219" [ 1131.736704] env[60722]: _type = "Task" [ 1131.736704] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1131.744227] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Task: {'id': task-565219, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1132.176168] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1132.176479] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Creating directory with path [datastore1] vmware_temp/20ff304a-c695-48c7-aa8f-89ba3c92b4ab/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1132.176655] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-18be6b74-71bf-421a-99ed-a5aeb1f43775 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.189425] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Created directory with path [datastore1] vmware_temp/20ff304a-c695-48c7-aa8f-89ba3c92b4ab/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1132.189609] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Fetch image to [datastore1] vmware_temp/20ff304a-c695-48c7-aa8f-89ba3c92b4ab/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1132.189771] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/20ff304a-c695-48c7-aa8f-89ba3c92b4ab/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1132.190537] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24fe887f-4824-4fb0-a0be-e1878fa4d077 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.196935] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5a1b4a9-4eee-417c-ba7b-6edb5bc8af77 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.205220] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69ac1eff-210f-444e-b295-5d76d7033d2a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.235060] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e70362f-3016-4b43-b74b-e09d94720d47 {{(pid=60722) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.245213] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Task: {'id': task-565219, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078678} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1132.246325] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1132.246499] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1132.246666] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1132.246853] env[60722]: INFO nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1132.248781] env[60722]: DEBUG nova.compute.claims [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1132.248941] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1132.249194] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1132.251479] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-331ca90f-bcd6-4541-a0f4-26253bd3b887 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.273105] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1132.275927] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1132.276580] env[60722]: DEBUG nova.compute.utils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Instance b9025e22-8080-4887-8e4e-179866f704ca could not be found. {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1132.278062] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Instance disappeared during build. 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1132.278225] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1132.278380] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1132.278539] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1132.278695] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1132.349637] env[60722]: DEBUG neutronclient.v2_0.client [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60722) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1132.350953] env[60722]: ERROR nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] Traceback (most recent call last): [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self.driver.spawn(context, instance, image_meta, [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self._fetch_image_if_missing(context, vi) [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] image_cache(vi, tmp_image_ds_loc) [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] vm_util.copy_virtual_disk( [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] session._wait_for_task(vmdk_copy_task) [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return self.wait_for_task(task_ref) [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return evt.wait() [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] result = hub.switch() [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return self.greenlet.switch() [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self.f(*self.args, **self.kw) [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] raise exceptions.translate_fault(task_info.error) [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] Faults: ['InvalidArgument'] [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] During handling of the above exception, another exception occurred: [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] Traceback (most recent call last): [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self._build_and_run_instance(context, instance, image, [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] with excutils.save_and_reraise_exception(): [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self.force_reraise() [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1132.350953] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] raise self.value [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] with self.rt.instance_claim(context, instance, node, allocs, [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self.abort() [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 
1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self.tracker.abort_instance_claim(self.context, self.instance, [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return f(*args, **kwargs) [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self._unset_instance_host_and_node(instance) [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] instance.save() [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] updates, result = self.indirection_api.object_action( [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return cctxt.call(context, 'object_action', objinst=objinst, [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] result = self.transport._send( [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return self._driver.send(target, ctxt, message, [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] raise result [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] nova.exception_Remote.InstanceNotFound_Remote: Instance b9025e22-8080-4887-8e4e-179866f704ca could not be found. 
[ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] Traceback (most recent call last): [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return getattr(target, method)(*args, **kwargs) [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return fn(self, *args, **kwargs) [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] old_ref, inst_ref = db.instance_update_and_get_original( [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return f(*args, **kwargs) [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1132.352286] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] with excutils.save_and_reraise_exception() as ectxt: [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self.force_reraise() [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] raise self.value [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return f(*args, 
**kwargs) [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return f(context, *args, **kwargs) [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] raise exception.InstanceNotFound(instance_id=uuid) [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] nova.exception.InstanceNotFound: Instance b9025e22-8080-4887-8e4e-179866f704ca could not be found. [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] During handling of the above exception, another exception occurred: [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] Traceback (most recent call last): [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] ret = obj(*args, **kwargs) [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] exception_handler_v20(status_code, error_body) [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] raise client_exc(message=error_message, [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1132.353625] 
env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] Neutron server returns request_ids: ['req-425602e3-b518-4bdc-9096-d4c6cf2ee064'] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] During handling of the above exception, another exception occurred: [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] Traceback (most recent call last): [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self._deallocate_network(context, instance, requested_networks) [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self.network_api.deallocate_for_instance( [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] data = neutron.list_ports(**search_opts) [ 1132.353625] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] ret = obj(*args, **kwargs) [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return self.list('ports', self.ports_path, retrieve_all, [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] ret = obj(*args, **kwargs) [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] for r in self._pagination(collection, path, **params): [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] res = self.get(path, params=params) [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] ret = obj(*args, **kwargs) [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return self.retry_request("GET", action, body=body, [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] ret = obj(*args, **kwargs) [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] return self.do_request(method, action, body=body, [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] ret = obj(*args, **kwargs) [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] self._handle_fault_response(status_code, replybody, resp) [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] raise exception.Unauthorized() [ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] nova.exception.Unauthorized: Not authorized. 
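The traceback above chains two failures: the database row for instance b9025e22-8080-4887-8e4e-179866f704ca is already gone (nova.exception.InstanceNotFound raised from instance_update_and_get_original), and the network cleanup that runs afterwards then gets a 401 back from Neutron, which the client wrapper in nova/network/neutron.py (the frames at lines 196 and 204 in this build) re-raises as nova.exception.Unauthorized. Below is a minimal sketch of that translate-the-client-exception pattern; the class and exception names are illustrative stand-ins, not nova's actual code.

    # Minimal sketch of the exception-translation wrapper visible in the
    # frames above (nova/network/neutron.py:196/204). Names are stand-ins.
    from neutronclient.common import exceptions as neutron_exc


    class NotAuthorized(Exception):
        """Stand-in for nova.exception.Unauthorized."""


    def translate_neutron_exception(method):
        def wrapper(self, *args, **kwargs):
            try:
                return method(self, *args, **kwargs)
            except neutron_exc.Unauthorized:
                # The Keystone/Neutron 401 surfaces here and is converted,
                # which is why the log ends with "nova.exception.Unauthorized".
                raise NotAuthorized()
        return wrapper


    class PortAPI:
        def __init__(self, client):
            self._client = client

        @translate_neutron_exception
        def list_ports(self, **search_opts):
            return self._client.list_ports(**search_opts)

Because the conversion happens in the wrapper, every neutronclient call made during cleanup (list_ports, pagination, retry_request) fails the same way once the request can no longer authenticate to Neutron.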
[ 1132.354883] env[60722]: ERROR nova.compute.manager [instance: b9025e22-8080-4887-8e4e-179866f704ca] [ 1132.370824] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Lock "b9025e22-8080-4887-8e4e-179866f704ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 342.932s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1132.439616] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1132.440423] env[60722]: ERROR nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Traceback (most recent call last): [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] result = getattr(controller, method)(*args, **kwargs) [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self._get(image_id) [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] resp, body = self.http_client.get(url, headers=header) [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self.request(url, 'GET', **kwargs) [ 1132.440423] 
env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self._handle_response(resp) [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] raise exc.from_response(resp, resp.content) [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] During handling of the above exception, another exception occurred: [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Traceback (most recent call last): [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] yield resources [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self.driver.spawn(context, instance, image_meta, [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self._fetch_image_if_missing(context, vi) [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] image_fetch(context, vi, tmp_image_ds_loc) [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] images.fetch_image( [ 
1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1132.440423] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] metadata = IMAGE_API.get(context, image_ref) [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return session.show(context, image_id, [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] _reraise_translated_image_exception(image_id) [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] raise new_exc.with_traceback(exc_trace) [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] result = getattr(controller, method)(*args, **kwargs) [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self._get(image_id) [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] resp, body = self.http_client.get(url, headers=header) [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self.request(url, 'GET', **kwargs) [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1132.441532] env[60722]: ERROR 
nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self._handle_response(resp) [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] raise exc.from_response(resp, resp.content) [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. [ 1132.441532] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] [ 1132.441532] env[60722]: INFO nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Terminating instance [ 1132.443314] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1132.443523] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1132.444187] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Start destroying the instance on the hypervisor. 
{{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1132.444374] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1132.444615] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e4b55dd1-599d-4944-b21a-2e3b734a9092 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.447689] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12bc3076-3c96-463e-9dfc-12dc8ae8bdaf {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.454847] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1132.455076] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-110ed097-9593-4059-9677-5bfe03516900 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.457259] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1132.457427] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1132.458352] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f6e58f3b-cea7-469d-b91b-d0b6ff99e585 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.463732] env[60722]: DEBUG oslo_vmware.api [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Waiting for the task: (returnval){ [ 1132.463732] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52e160fb-50a1-ac45-532a-a4cd2d14946f" [ 1132.463732] env[60722]: _type = "Task" [ 1132.463732] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1132.470575] env[60722]: DEBUG oslo_vmware.api [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52e160fb-50a1-ac45-532a-a4cd2d14946f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1132.516030] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1132.516255] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1132.516426] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Deleting the datastore file [datastore1] 020c2b79-e755-4178-aa85-5ecaa31e7a9f {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1132.516662] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2b32a902-f76d-4c89-9d07-5095bfdc4b66 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.523236] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Waiting for the task: (returnval){ [ 1132.523236] env[60722]: value = "task-565221" [ 1132.523236] env[60722]: _type = "Task" [ 1132.523236] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1132.531899] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Task: {'id': task-565221, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1132.973828] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1132.974510] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Creating directory with path [datastore1] vmware_temp/41c80e67-8481-4f2c-b26f-1347b28843ba/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1132.974510] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ca6e4d34-dee8-412c-8050-2b54e2ebc3e7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.985701] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Created directory with path [datastore1] vmware_temp/41c80e67-8481-4f2c-b26f-1347b28843ba/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1132.985875] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Fetch image to [datastore1] vmware_temp/41c80e67-8481-4f2c-b26f-1347b28843ba/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1132.986045] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/41c80e67-8481-4f2c-b26f-1347b28843ba/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1132.987046] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71fbc049-4929-4569-865a-7a0990c2d508 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1132.993242] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32b489c9-1760-4fe8-b8c8-8ad3b0766972 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1133.001805] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31884763-e4ea-418c-8fcb-4d9f08488e22 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1133.035086] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-759c7b0f-4445-464c-9a06-90091abb9986 {{(pid=60722) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1133.041857] env[60722]: DEBUG oslo_vmware.api [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Task: {'id': task-565221, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07823} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1133.043276] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1133.043453] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1133.043627] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1133.043791] env[60722]: INFO nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Took 0.60 seconds to destroy the instance on the hypervisor. 
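The destroy sequence above follows the usual asynchronous vSphere pattern: FileManager.DeleteDatastoreFile_Task is invoked, and oslo_vmware.api then waits on the returned task, logging its progress ("progress is 0%") until TaskInfo reports a terminal state ('duration_secs': 0.07823, completed successfully). A rough sketch of that poll-until-terminal loop is below; get_task_info is a hypothetical callable standing in for the real TaskInfo lookup that oslo.vmware performs.

    # Rough sketch of the polling pattern reflected in the
    # "_poll_task ... progress is 0%" lines above. The real loop lives in
    # oslo_vmware.api.VMwareAPISession.wait_for_task; this is not its code.
    import time


    class TaskFailed(Exception):
        pass


    def wait_for_task(get_task_info, poll_interval=0.5):
        """Poll a vSphere-style task until it succeeds or errors out."""
        while True:
            info = get_task_info()  # e.g. {'state': 'running', 'progress': 0}
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                raise TaskFailed(info.get('error'))
            # Still 'queued' or 'running': report progress and poll again.
            print("progress is %s%%" % info.get('progress', 0))
            time.sleep(poll_interval)

In the log, task-565221 reaches success on the first re-poll, which is why only a single "progress is 0%" line appears before the completion record.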
[ 1133.045548] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-acf84199-58d8-4a4a-b26f-6a7b092fd116 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1133.047380] env[60722]: DEBUG nova.compute.claims [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1133.047538] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1133.047737] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1133.069092] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1133.076964] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1133.076964] env[60722]: DEBUG nova.compute.utils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Instance 020c2b79-e755-4178-aa85-5ecaa31e7a9f could not be found. {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1133.077125] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Instance disappeared during build. 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1133.077191] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1133.077341] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1133.077494] env[60722]: DEBUG nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1133.077642] env[60722]: DEBUG nova.network.neutron [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1133.209726] env[60722]: DEBUG neutronclient.v2_0.client [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60722) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1133.211541] env[60722]: ERROR nova.compute.manager [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Traceback (most recent call last): [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] result = getattr(controller, method)(*args, **kwargs) [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self._get(image_id) [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] resp, body = self.http_client.get(url, headers=header) [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self.request(url, 'GET', **kwargs) [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self._handle_response(resp) [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] raise exc.from_response(resp, resp.content) [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] During handling of the above exception, another exception occurred: [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Traceback (most recent call last): [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self.driver.spawn(context, instance, image_meta, [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self._fetch_image_if_missing(context, vi) [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] image_fetch(context, vi, tmp_image_ds_loc) [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] images.fetch_image( [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] metadata = IMAGE_API.get(context, image_ref) [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return session.show(context, image_id, [ 1133.211541] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] _reraise_translated_image_exception(image_id) [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] raise new_exc.with_traceback(exc_trace) [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 
020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] result = getattr(controller, method)(*args, **kwargs) [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self._get(image_id) [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] resp, body = self.http_client.get(url, headers=header) [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self.request(url, 'GET', **kwargs) [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self._handle_response(resp) [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] raise exc.from_response(resp, resp.content) [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. 
[ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] During handling of the above exception, another exception occurred: [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Traceback (most recent call last): [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self._build_and_run_instance(context, instance, image, [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] with excutils.save_and_reraise_exception(): [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self.force_reraise() [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] raise self.value [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] with self.rt.instance_claim(context, instance, node, allocs, [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self.abort() [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self.tracker.abort_instance_claim(self.context, self.instance, [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1133.212775] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return f(*args, **kwargs) [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self._unset_instance_host_and_node(instance) [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 
020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] instance.save() [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] updates, result = self.indirection_api.object_action( [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return cctxt.call(context, 'object_action', objinst=objinst, [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] result = self.transport._send( [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self._driver.send(target, ctxt, message, [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] raise result [ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] nova.exception_Remote.InstanceNotFound_Remote: Instance 020c2b79-e755-4178-aa85-5ecaa31e7a9f could not be found. 
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Traceback (most recent call last):
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return getattr(target, method)(*args, **kwargs)
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return fn(self, *args, **kwargs)
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] old_ref, inst_ref = db.instance_update_and_get_original(
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return f(*args, **kwargs)
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] with excutils.save_and_reraise_exception() as ectxt:
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self.force_reraise()
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] raise self.value
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return f(*args, **kwargs)
[ 1133.213794] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return f(context, *args, **kwargs)
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] raise exception.InstanceNotFound(instance_id=uuid)
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] nova.exception.InstanceNotFound: Instance 020c2b79-e755-4178-aa85-5ecaa31e7a9f could not be found.
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] During handling of the above exception, another exception occurred:
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Traceback (most recent call last):
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] ret = obj(*args, **kwargs)
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] exception_handler_v20(status_code, error_body)
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] raise client_exc(message=error_message,
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Neutron server returns request_ids: ['req-e5185cb7-1f48-4a39-b929-6df7ba8c5fa6']
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] During handling of the above exception, another exception occurred:
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Traceback (most recent call last):
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self._deallocate_network(context, instance, requested_networks)
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self.network_api.deallocate_for_instance(
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] data = neutron.list_ports(**search_opts)
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] ret = obj(*args, **kwargs)
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self.list('ports', self.ports_path, retrieve_all,
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] ret = obj(*args, **kwargs)
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1133.215274] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] for r in self._pagination(collection, path, **params):
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] res = self.get(path, params=params)
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] ret = obj(*args, **kwargs)
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self.retry_request("GET", action, body=body,
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] ret = obj(*args, **kwargs)
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] return self.do_request(method, action, body=body,
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] ret = obj(*args, **kwargs)
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] self._handle_fault_response(status_code, replybody, resp)
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] raise exception.Unauthorized()
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] nova.exception.Unauthorized: Not authorized.
[ 1133.216878] env[60722]: ERROR nova.compute.manager [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f]
[ 1133.227937] env[60722]: DEBUG oslo_vmware.rw_handles [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/41c80e67-8481-4f2c-b26f-1347b28843ba/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1.
{{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1133.282335] env[60722]: DEBUG oslo_concurrency.lockutils [None req-a94d3328-8687-4577-a692-e48ebb600d1c tempest-MultipleCreateTestJSON-922706562 tempest-MultipleCreateTestJSON-922706562-project-member] Lock "020c2b79-e755-4178-aa85-5ecaa31e7a9f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 343.820s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1133.285828] env[60722]: DEBUG oslo_vmware.rw_handles [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Completed reading data from the image iterator. {{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1133.285952] env[60722]: DEBUG oslo_vmware.rw_handles [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/41c80e67-8481-4f2c-b26f-1347b28843ba/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1139.944950] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1139.944950] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1139.944950] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1139.944950] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60722) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1139.945759] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1139.945759] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Cleaning up deleted instances with incomplete migration {{(pid=60722) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11133}} [ 1142.953263] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1142.953587] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1143.939943] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1143.943576] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1143.953587] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1143.953867] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1143.953910] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1143.954047] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1143.955175] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bac32727-6101-4b00-93d6-06e419f647ee {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.963349] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-74fe5c86-6324-4641-9cb0-1abce39da60a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.977930] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ec973d4-233b-44dc-9d59-eb5d9d3e1877 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.984277] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7cfa592-0bb2-495f-9bb9-bcb10898d796 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.014906] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181629MB free_disk=100GB free_vcpus=48 pci_devices=None {{(pid=60722) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1144.015099] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1144.015173] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1144.110398] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1144.110656] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1144.126924] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Refreshing inventories for resource provider 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1144.140118] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Updating ProviderTree inventory for provider 6d7f336b-9351-4171-8197-866cdafbab42 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1144.140306] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Updating inventory in ProviderTree for provider 
6d7f336b-9351-4171-8197-866cdafbab42 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1144.151049] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Refreshing aggregate associations for resource provider 6d7f336b-9351-4171-8197-866cdafbab42, aggregates: None {{(pid=60722) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1144.167648] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Refreshing trait associations for resource provider 6d7f336b-9351-4171-8197-866cdafbab42, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SAME_HOST_COLD_MIGRATE {{(pid=60722) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1144.180802] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5669755-3e4c-419d-960f-246752367821 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.188284] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dc9ed1c-7a11-4f87-9a2d-dd438027c763 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.219391] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-902e5d28-09d6-4fda-ad3a-e435887363e3 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.226798] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2ba2280-28f5-458d-97d1-40cc698cafa2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1144.239949] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1144.249098] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1144.263945] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 
1144.264166] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.249s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1144.264355] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1144.264514] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Cleaning up deleted instances {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11095}} [ 1144.293630] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] There are 10 instances to clean {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11104}} [ 1144.293935] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Instance has had 0 of 5 cleanup attempts {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1144.336618] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Instance has had 0 of 5 cleanup attempts {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1144.377123] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 020c2b79-e755-4178-aa85-5ecaa31e7a9f] Instance has had 0 of 5 cleanup attempts {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1144.396596] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: b9025e22-8080-4887-8e4e-179866f704ca] Instance has had 0 of 5 cleanup attempts {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1144.414963] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 019db29d-b8e4-4592-b7c4-2c044e2b2a51] Instance has had 0 of 5 cleanup attempts {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1144.432873] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 2786801d-6211-4598-b357-4f0a0ffdd7d1] Instance has had 0 of 5 cleanup attempts {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1144.452403] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 1c4b8597-88ec-4e79-a749-f802803a5ffe] Instance has had 0 of 5 cleanup attempts {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1144.470599] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 22463917-2185-42f7-87b7-2b720be45c22] Instance has had 0 of 5 cleanup attempts {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1144.488316] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: 4e66f1dc-18c6-4d64-9bbe-9b061e795a65] Instance has had 0 of 5 cleanup 
attempts {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1144.505523] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: eae8d9ce-9fe3-411e-9fd8-05920fb0af04] Instance has had 0 of 5 cleanup attempts {{(pid=60722) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1144.944263] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1144.944516] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1147.946938] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1147.956387] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1147.956539] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Starting heal instance info cache {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1147.956683] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Rebuilding the list of instances to heal {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1147.964252] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Didn't find any instances for network info cache update. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1158.927375] env[60722]: DEBUG oslo_concurrency.lockutils [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Acquiring lock "4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1158.927882] env[60722]: DEBUG oslo_concurrency.lockutils [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lock "4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1158.939405] env[60722]: DEBUG nova.compute.manager [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Starting instance... 
{{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1158.981496] env[60722]: DEBUG oslo_concurrency.lockutils [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1158.981712] env[60722]: DEBUG oslo_concurrency.lockutils [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1158.983226] env[60722]: INFO nova.compute.claims [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1159.047105] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3648abaf-a657-4daf-ae87-ce218cdd01a7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1159.054289] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e1f531d-608c-474b-b5e0-7fd763d5cfb9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1159.083988] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32da39ed-7018-4b2c-8e91-ab8e3a1754ab {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1159.090478] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ac0c75d-2089-4dc7-9b43-d0c414ebe4e6 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1159.102951] env[60722]: DEBUG nova.compute.provider_tree [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1159.110268] env[60722]: DEBUG nova.scheduler.client.report [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1159.124028] env[60722]: DEBUG oslo_concurrency.lockutils [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac 
tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.142s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1159.124366] env[60722]: DEBUG nova.compute.manager [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1159.154302] env[60722]: DEBUG nova.compute.utils [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1159.155526] env[60722]: DEBUG nova.compute.manager [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1159.155689] env[60722]: DEBUG nova.network.neutron [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1159.162917] env[60722]: DEBUG nova.compute.manager [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Start building block device mappings for instance. 
{{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1159.191330] env[60722]: INFO nova.virt.block_device [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Booting with volume cdb00472-d087-470b-bc90-3c9a91203f67 at /dev/sda [ 1159.207992] env[60722]: DEBUG nova.policy [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6c49f7d3ddb940edab756cd68809a49a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dd58404b6abc4a7b879fa0ef70e64beb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 1159.233946] env[60722]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bf335dca-6e82-45e1-acb5-44b602145e90 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1159.245300] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d4d0755-1667-4236-8166-74f38027480e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1159.267314] env[60722]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fa4c93e3-24af-433b-89b7-30df35fe969e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1159.273960] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3a2f0d8-13d6-4c33-90f5-4e8acb2603a4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1159.294970] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cfd4add-3600-4a89-ab03-ec9cb1b6ce8d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1159.301130] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-679a0883-9cab-4bf4-ac16-9d7e8c1c290a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1159.314149] env[60722]: DEBUG nova.virt.block_device [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Updating existing volume attachment record: b21e2d0b-a889-4978-b7e6-ef92daf496cf {{(pid=60722) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 1159.487567] env[60722]: DEBUG nova.network.neutron [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Successfully created port: dba6226f-09ef-4871-9f85-78b3464b9af5 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1159.527742] env[60722]: DEBUG 
nova.compute.manager [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Start spawning the instance on the hypervisor. {{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1159.528264] env[60722]: DEBUG nova.virt.hardware [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1159.528466] env[60722]: DEBUG nova.virt.hardware [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1159.528613] env[60722]: DEBUG nova.virt.hardware [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1159.528784] env[60722]: DEBUG nova.virt.hardware [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1159.528923] env[60722]: DEBUG nova.virt.hardware [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1159.529074] env[60722]: DEBUG nova.virt.hardware [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1159.529284] env[60722]: DEBUG nova.virt.hardware [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1159.529433] env[60722]: DEBUG nova.virt.hardware [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1159.529593] env[60722]: DEBUG nova.virt.hardware [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1159.529752] env[60722]: DEBUG nova.virt.hardware [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1159.529921] env[60722]: DEBUG nova.virt.hardware [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1159.531280] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-675461dc-597f-4c3c-8533-bce9bb53ff3e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1159.539395] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d401dd00-8659-4125-8904-c4cf50f1d5e2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1159.945263] env[60722]: DEBUG nova.compute.manager [req-28437202-72ba-4556-b85e-6a2d73f7a543 req-7e6bc94c-9665-4f3f-90a4-8a24d97c78eb service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Received event network-vif-plugged-dba6226f-09ef-4871-9f85-78b3464b9af5 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1159.945526] env[60722]: DEBUG oslo_concurrency.lockutils [req-28437202-72ba-4556-b85e-6a2d73f7a543 req-7e6bc94c-9665-4f3f-90a4-8a24d97c78eb service nova] Acquiring lock "4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1159.945718] env[60722]: DEBUG oslo_concurrency.lockutils [req-28437202-72ba-4556-b85e-6a2d73f7a543 req-7e6bc94c-9665-4f3f-90a4-8a24d97c78eb service nova] Lock "4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1159.945881] env[60722]: DEBUG oslo_concurrency.lockutils [req-28437202-72ba-4556-b85e-6a2d73f7a543 req-7e6bc94c-9665-4f3f-90a4-8a24d97c78eb service nova] Lock "4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1159.947525] env[60722]: DEBUG nova.compute.manager [req-28437202-72ba-4556-b85e-6a2d73f7a543 req-7e6bc94c-9665-4f3f-90a4-8a24d97c78eb service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] No waiting events found dispatching network-vif-plugged-dba6226f-09ef-4871-9f85-78b3464b9af5 {{(pid=60722) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 1159.947770] env[60722]: WARNING nova.compute.manager [req-28437202-72ba-4556-b85e-6a2d73f7a543 req-7e6bc94c-9665-4f3f-90a4-8a24d97c78eb service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Received unexpected event network-vif-plugged-dba6226f-09ef-4871-9f85-78b3464b9af5 for instance with vm_state building and task_state spawning. [ 1160.014348] env[60722]: DEBUG nova.network.neutron [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Successfully updated port: dba6226f-09ef-4871-9f85-78b3464b9af5 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1160.023525] env[60722]: DEBUG oslo_concurrency.lockutils [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Acquiring lock "refresh_cache-4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1160.023720] env[60722]: DEBUG oslo_concurrency.lockutils [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Acquired lock "refresh_cache-4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1160.023885] env[60722]: DEBUG nova.network.neutron [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1160.061825] env[60722]: DEBUG nova.network.neutron [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Instance cache missing network info. 
{{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1160.205376] env[60722]: DEBUG nova.network.neutron [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Updating instance_info_cache with network_info: [{"id": "dba6226f-09ef-4871-9f85-78b3464b9af5", "address": "fa:16:3e:cb:b5:91", "network": {"id": "c6156416-f75e-45d4-b55b-79076c3790b5", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-854715999-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dd58404b6abc4a7b879fa0ef70e64beb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7b2049d7-f99e-425a-afdb-2c95ca88e483", "external-id": "nsx-vlan-transportzone-803", "segmentation_id": 803, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdba6226f-09", "ovs_interfaceid": "dba6226f-09ef-4871-9f85-78b3464b9af5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1160.216637] env[60722]: DEBUG oslo_concurrency.lockutils [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Releasing lock "refresh_cache-4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1160.216914] env[60722]: DEBUG nova.compute.manager [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Instance network_info: |[{"id": "dba6226f-09ef-4871-9f85-78b3464b9af5", "address": "fa:16:3e:cb:b5:91", "network": {"id": "c6156416-f75e-45d4-b55b-79076c3790b5", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-854715999-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dd58404b6abc4a7b879fa0ef70e64beb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7b2049d7-f99e-425a-afdb-2c95ca88e483", "external-id": "nsx-vlan-transportzone-803", "segmentation_id": 803, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdba6226f-09", "ovs_interfaceid": "dba6226f-09ef-4871-9f85-78b3464b9af5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 
1160.217283] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:cb:b5:91', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7b2049d7-f99e-425a-afdb-2c95ca88e483', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'dba6226f-09ef-4871-9f85-78b3464b9af5', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1160.224648] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Creating folder: Project (dd58404b6abc4a7b879fa0ef70e64beb). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1160.225119] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a6ba7782-b98b-4f0a-b6da-8c3c93aed110 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1160.238216] env[60722]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 1160.238333] env[60722]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=60722) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 1160.238588] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Folder already exists: Project (dd58404b6abc4a7b879fa0ef70e64beb). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 1160.238763] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Creating folder: Instances. Parent ref: group-v141661. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1160.238958] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-57763970-07e6-40f1-9f7a-9792cc08c96d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1160.246998] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Created folder: Instances in parent group-v141661. [ 1160.247218] env[60722]: DEBUG oslo.service.loopingcall [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1160.247376] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1160.247546] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e241c28f-27f5-48f6-a76a-106b86533405 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1160.265119] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1160.265119] env[60722]: value = "task-565234" [ 1160.265119] env[60722]: _type = "Task" [ 1160.265119] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1160.271897] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565234, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1160.775062] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565234, 'name': CreateVM_Task, 'duration_secs': 0.287067} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1160.775239] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1160.775804] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'device_type': None, 'boot_index': 0, 'attachment_id': 'b21e2d0b-a889-4978-b7e6-ef92daf496cf', 'disk_bus': None, 'guest_format': None, 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-141664', 'volume_id': 'cdb00472-d087-470b-bc90-3c9a91203f67', 'name': 'volume-cdb00472-d087-470b-bc90-3c9a91203f67', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa', 'attached_at': '', 'detached_at': '', 'volume_id': 'cdb00472-d087-470b-bc90-3c9a91203f67', 'serial': 'cdb00472-d087-470b-bc90-3c9a91203f67'}, 'mount_device': '/dev/sda', 'delete_on_termination': True, 'volume_type': None}], 'swap': None} {{(pid=60722) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 1160.776018] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Root volume attach. 
Driver type: vmdk {{(pid=60722) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 1160.776734] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f8243c9-d1c0-4bb1-b7af-e921114cd944 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1160.783891] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c858ea22-65fa-4bfd-9028-258389510799 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1160.789204] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70460d7e-e231-48d4-ae76-aea9a994c558 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1160.794759] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-56b11d16-f1d4-4662-80dd-d04106953ba6 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1160.800612] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Waiting for the task: (returnval){ [ 1160.800612] env[60722]: value = "task-565235" [ 1160.800612] env[60722]: _type = "Task" [ 1160.800612] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1160.807552] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565235, 'name': RelocateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1161.312936] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565235, 'name': RelocateVM_Task} progress is 42%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1161.813161] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565235, 'name': RelocateVM_Task} progress is 56%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1161.975126] env[60722]: DEBUG nova.compute.manager [req-cb5415a4-4534-4198-899f-9c5ff912cd91 req-40ee8def-b6b6-4a74-a07e-a136013c76e7 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Received event network-changed-dba6226f-09ef-4871-9f85-78b3464b9af5 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1161.975320] env[60722]: DEBUG nova.compute.manager [req-cb5415a4-4534-4198-899f-9c5ff912cd91 req-40ee8def-b6b6-4a74-a07e-a136013c76e7 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Refreshing instance network info cache due to event network-changed-dba6226f-09ef-4871-9f85-78b3464b9af5. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1161.975531] env[60722]: DEBUG oslo_concurrency.lockutils [req-cb5415a4-4534-4198-899f-9c5ff912cd91 req-40ee8def-b6b6-4a74-a07e-a136013c76e7 service nova] Acquiring lock "refresh_cache-4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1161.975668] env[60722]: DEBUG oslo_concurrency.lockutils [req-cb5415a4-4534-4198-899f-9c5ff912cd91 req-40ee8def-b6b6-4a74-a07e-a136013c76e7 service nova] Acquired lock "refresh_cache-4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1161.975822] env[60722]: DEBUG nova.network.neutron [req-cb5415a4-4534-4198-899f-9c5ff912cd91 req-40ee8def-b6b6-4a74-a07e-a136013c76e7 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Refreshing network info cache for port dba6226f-09ef-4871-9f85-78b3464b9af5 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1162.277020] env[60722]: DEBUG nova.network.neutron [req-cb5415a4-4534-4198-899f-9c5ff912cd91 req-40ee8def-b6b6-4a74-a07e-a136013c76e7 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Updated VIF entry in instance network info cache for port dba6226f-09ef-4871-9f85-78b3464b9af5. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1162.277396] env[60722]: DEBUG nova.network.neutron [req-cb5415a4-4534-4198-899f-9c5ff912cd91 req-40ee8def-b6b6-4a74-a07e-a136013c76e7 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Updating instance_info_cache with network_info: [{"id": "dba6226f-09ef-4871-9f85-78b3464b9af5", "address": "fa:16:3e:cb:b5:91", "network": {"id": "c6156416-f75e-45d4-b55b-79076c3790b5", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-854715999-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dd58404b6abc4a7b879fa0ef70e64beb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7b2049d7-f99e-425a-afdb-2c95ca88e483", "external-id": "nsx-vlan-transportzone-803", "segmentation_id": 803, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdba6226f-09", "ovs_interfaceid": "dba6226f-09ef-4871-9f85-78b3464b9af5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1162.288285] env[60722]: DEBUG oslo_concurrency.lockutils [req-cb5415a4-4534-4198-899f-9c5ff912cd91 req-40ee8def-b6b6-4a74-a07e-a136013c76e7 service nova] Releasing lock "refresh_cache-4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1162.316924] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565235, 'name': RelocateVM_Task} progress 
is 71%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1162.816026] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565235, 'name': RelocateVM_Task} progress is 88%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1163.315157] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565235, 'name': RelocateVM_Task} progress is 97%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1163.815090] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565235, 'name': RelocateVM_Task, 'duration_secs': 2.921562} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1163.815498] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Volume attach. Driver type: vmdk {{(pid=60722) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 1163.815720] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-141664', 'volume_id': 'cdb00472-d087-470b-bc90-3c9a91203f67', 'name': 'volume-cdb00472-d087-470b-bc90-3c9a91203f67', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa', 'attached_at': '', 'detached_at': '', 'volume_id': 'cdb00472-d087-470b-bc90-3c9a91203f67', 'serial': 'cdb00472-d087-470b-bc90-3c9a91203f67'} {{(pid=60722) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 1163.816714] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33a4f645-1404-470a-9e52-55c99d6cc3d0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1163.834635] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31c3497c-4b0d-45b4-8742-e93522c9411e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1163.856395] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Reconfiguring VM instance instance-0000001d to attach disk [datastore1] volume-cdb00472-d087-470b-bc90-3c9a91203f67/volume-cdb00472-d087-470b-bc90-3c9a91203f67.vmdk or device None with type thin {{(pid=60722) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 1163.856617] env[60722]: DEBUG 
oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-7a63d854-2a8e-4946-bf19-4b1586becbb1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1163.876226] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Waiting for the task: (returnval){ [ 1163.876226] env[60722]: value = "task-565236" [ 1163.876226] env[60722]: _type = "Task" [ 1163.876226] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1163.886207] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565236, 'name': ReconfigVM_Task} progress is 5%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1164.386707] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565236, 'name': ReconfigVM_Task, 'duration_secs': 0.282661} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1164.387056] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Reconfigured VM instance instance-0000001d to attach disk [datastore1] volume-cdb00472-d087-470b-bc90-3c9a91203f67/volume-cdb00472-d087-470b-bc90-3c9a91203f67.vmdk or device None with type thin {{(pid=60722) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 1164.391613] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-84f8ab03-da6e-4d1b-a079-703abbde916b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1164.406758] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Waiting for the task: (returnval){ [ 1164.406758] env[60722]: value = "task-565237" [ 1164.406758] env[60722]: _type = "Task" [ 1164.406758] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1164.414330] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565237, 'name': ReconfigVM_Task} progress is 5%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1164.916867] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565237, 'name': ReconfigVM_Task, 'duration_secs': 0.127583} completed successfully. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1164.917268] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-141664', 'volume_id': 'cdb00472-d087-470b-bc90-3c9a91203f67', 'name': 'volume-cdb00472-d087-470b-bc90-3c9a91203f67', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa', 'attached_at': '', 'detached_at': '', 'volume_id': 'cdb00472-d087-470b-bc90-3c9a91203f67', 'serial': 'cdb00472-d087-470b-bc90-3c9a91203f67'} {{(pid=60722) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 1164.917815] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-9cf48eed-a65e-4d5e-ab1c-7a6b5d6ebeb0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1164.926785] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Waiting for the task: (returnval){ [ 1164.926785] env[60722]: value = "task-565238" [ 1164.926785] env[60722]: _type = "Task" [ 1164.926785] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1164.934598] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565238, 'name': Rename_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1165.436960] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565238, 'name': Rename_Task, 'duration_secs': 0.138502} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1165.437226] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Powering on the VM {{(pid=60722) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 1165.437442] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-ed158b2e-ac8a-4a54-ae31-7cee0b688d3e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1165.443865] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Waiting for the task: (returnval){ [ 1165.443865] env[60722]: value = "task-565239" [ 1165.443865] env[60722]: _type = "Task" [ 1165.443865] env[60722]: } to complete. 
{{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1165.452355] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565239, 'name': PowerOnVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1165.954350] env[60722]: DEBUG oslo_vmware.api [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565239, 'name': PowerOnVM_Task, 'duration_secs': 0.42243} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1165.954691] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Powered on the VM {{(pid=60722) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 1165.954757] env[60722]: INFO nova.compute.manager [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Took 6.43 seconds to spawn the instance on the hypervisor. [ 1165.954984] env[60722]: DEBUG nova.compute.manager [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Checking state {{(pid=60722) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 1165.955732] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a138c13-b939-43d7-afb3-eda22f0010ee {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1166.003322] env[60722]: INFO nova.compute.manager [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Took 7.03 seconds to build instance. [ 1166.013210] env[60722]: DEBUG oslo_concurrency.lockutils [None req-f7bd9200-89a3-4cce-a9a4-64b5b027fcac tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lock "4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.085s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1167.375479] env[60722]: DEBUG nova.compute.manager [req-bf0176da-e56a-43d0-a850-5dbb7a6d17c6 req-470af155-2e72-4597-ac86-6af1064cd963 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Received event network-changed-dba6226f-09ef-4871-9f85-78b3464b9af5 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1167.375753] env[60722]: DEBUG nova.compute.manager [req-bf0176da-e56a-43d0-a850-5dbb7a6d17c6 req-470af155-2e72-4597-ac86-6af1064cd963 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Refreshing instance network info cache due to event network-changed-dba6226f-09ef-4871-9f85-78b3464b9af5. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1167.375919] env[60722]: DEBUG oslo_concurrency.lockutils [req-bf0176da-e56a-43d0-a850-5dbb7a6d17c6 req-470af155-2e72-4597-ac86-6af1064cd963 service nova] Acquiring lock "refresh_cache-4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1167.376033] env[60722]: DEBUG oslo_concurrency.lockutils [req-bf0176da-e56a-43d0-a850-5dbb7a6d17c6 req-470af155-2e72-4597-ac86-6af1064cd963 service nova] Acquired lock "refresh_cache-4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1167.376414] env[60722]: DEBUG nova.network.neutron [req-bf0176da-e56a-43d0-a850-5dbb7a6d17c6 req-470af155-2e72-4597-ac86-6af1064cd963 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Refreshing network info cache for port dba6226f-09ef-4871-9f85-78b3464b9af5 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1167.654224] env[60722]: DEBUG nova.network.neutron [req-bf0176da-e56a-43d0-a850-5dbb7a6d17c6 req-470af155-2e72-4597-ac86-6af1064cd963 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Updated VIF entry in instance network info cache for port dba6226f-09ef-4871-9f85-78b3464b9af5. {{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1167.654571] env[60722]: DEBUG nova.network.neutron [req-bf0176da-e56a-43d0-a850-5dbb7a6d17c6 req-470af155-2e72-4597-ac86-6af1064cd963 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Updating instance_info_cache with network_info: [{"id": "dba6226f-09ef-4871-9f85-78b3464b9af5", "address": "fa:16:3e:cb:b5:91", "network": {"id": "c6156416-f75e-45d4-b55b-79076c3790b5", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-854715999-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dd58404b6abc4a7b879fa0ef70e64beb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7b2049d7-f99e-425a-afdb-2c95ca88e483", "external-id": "nsx-vlan-transportzone-803", "segmentation_id": 803, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdba6226f-09", "ovs_interfaceid": "dba6226f-09ef-4871-9f85-78b3464b9af5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1167.664793] env[60722]: DEBUG oslo_concurrency.lockutils [req-bf0176da-e56a-43d0-a850-5dbb7a6d17c6 req-470af155-2e72-4597-ac86-6af1064cd963 service nova] Releasing lock "refresh_cache-4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1183.286330] env[60722]: WARNING oslo_vmware.rw_handles [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 
tempest-ServerMetadataTestJSON-676342327-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1183.286330] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1183.286330] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1183.286330] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1183.286330] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1183.286330] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 1183.286330] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1183.286330] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1183.286330] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1183.286330] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1183.286330] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1183.286330] env[60722]: ERROR oslo_vmware.rw_handles [ 1183.286991] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/41c80e67-8481-4f2c-b26f-1347b28843ba/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1183.288382] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Caching image {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1183.288620] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Copying Virtual Disk [datastore1] vmware_temp/41c80e67-8481-4f2c-b26f-1347b28843ba/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/41c80e67-8481-4f2c-b26f-1347b28843ba/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1183.288888] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-208e5400-c63a-4259-9ef1-ea8594d39134 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1183.296464] env[60722]: DEBUG oslo_vmware.api [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Waiting for the task: (returnval){ [ 1183.296464] env[60722]: value = "task-565240" [ 1183.296464] env[60722]: _type = "Task" [ 1183.296464] env[60722]: } to complete. 
{{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1183.304130] env[60722]: DEBUG oslo_vmware.api [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Task: {'id': task-565240, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1183.806663] env[60722]: DEBUG oslo_vmware.exceptions [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Fault InvalidArgument not matched. {{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1183.806907] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1183.807440] env[60722]: ERROR nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1183.807440] env[60722]: Faults: ['InvalidArgument'] [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Traceback (most recent call last): [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] yield resources [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] self.driver.spawn(context, instance, image_meta, [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] self._fetch_image_if_missing(context, vi) [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] image_cache(vi, tmp_image_ds_loc) [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] vm_util.copy_virtual_disk( [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] session._wait_for_task(vmdk_copy_task) [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] return self.wait_for_task(task_ref) [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] return evt.wait() [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] result = hub.switch() [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] return self.greenlet.switch() [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] self.f(*self.args, **self.kw) [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] raise exceptions.translate_fault(task_info.error) [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Faults: ['InvalidArgument'] [ 1183.807440] env[60722]: ERROR nova.compute.manager [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] [ 1183.808637] env[60722]: INFO nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Terminating instance [ 1183.809218] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" 
{{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1183.809417] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1183.809644] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cc8dfd38-a8a6-445a-b872-1846d6c6d25b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1183.811729] env[60722]: DEBUG nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1183.811907] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1183.812602] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73f33182-adb7-4c14-b0e7-47e211b28143 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1183.819032] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1183.819223] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dbe97940-b054-4040-b273-0ea5c168012c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1183.821228] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1183.821391] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1183.822282] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bb93b995-cd80-41b9-bcbd-308033e321d7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1183.827072] env[60722]: DEBUG oslo_vmware.api [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Waiting for the task: (returnval){ [ 1183.827072] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]521b79a4-0dd9-1483-39a5-3a4f002b2754" [ 1183.827072] env[60722]: _type = "Task" [ 1183.827072] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1183.838392] env[60722]: DEBUG oslo_vmware.api [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]521b79a4-0dd9-1483-39a5-3a4f002b2754, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1183.877963] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1183.878184] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1183.878364] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Deleting the datastore file [datastore1] 8d78f310-a2f2-4073-8371-afc42cc566f2 {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1183.878601] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a7a30764-29a9-48d8-9c0c-c64e03441843 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1183.884236] env[60722]: DEBUG oslo_vmware.api [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Waiting for the task: (returnval){ [ 1183.884236] env[60722]: value = "task-565242" [ 1183.884236] env[60722]: _type = "Task" [ 1183.884236] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1183.891379] env[60722]: DEBUG oslo_vmware.api [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Task: {'id': task-565242, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1184.337411] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1184.337859] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Creating directory with path [datastore1] vmware_temp/f53ebaa7-6c22-4c56-8fd3-7e8fe2d81683/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1184.337859] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-26c7101d-70f6-43bf-9429-40cc0b63e805 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.348864] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Created directory with path [datastore1] vmware_temp/f53ebaa7-6c22-4c56-8fd3-7e8fe2d81683/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1184.349054] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Fetch image to [datastore1] vmware_temp/f53ebaa7-6c22-4c56-8fd3-7e8fe2d81683/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1184.349224] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] vmware_temp/f53ebaa7-6c22-4c56-8fd3-7e8fe2d81683/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1184.349901] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2ba2847-a7fb-4cea-9ebe-79b63260e666 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.356097] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38f3ad35-68d4-4156-8b63-b9d5ac329214 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.364586] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00da262e-1b21-4294-89b1-d55e13c90a52 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.397575] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-145f4679-bf2e-4dcf-bee4-b078587a20c5 {{(pid=60722) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.405656] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-737bdbdf-aaab-4965-a18e-22e5ac7603ee {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.407267] env[60722]: DEBUG oslo_vmware.api [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Task: {'id': task-565242, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076537} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1184.407493] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1184.407663] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1184.407826] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1184.407993] env[60722]: INFO nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1184.410073] env[60722]: DEBUG nova.compute.claims [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1184.410264] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1184.410517] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1184.428051] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1184.434025] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.023s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1184.434753] env[60722]: DEBUG nova.compute.utils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Instance 8d78f310-a2f2-4073-8371-afc42cc566f2 could not be found. {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1184.436192] env[60722]: DEBUG nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Instance disappeared during build. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1184.436358] env[60722]: DEBUG nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1184.436517] env[60722]: DEBUG nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1184.436677] env[60722]: DEBUG nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1184.436832] env[60722]: DEBUG nova.network.neutron [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1184.459652] env[60722]: DEBUG nova.network.neutron [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1184.468183] env[60722]: INFO nova.compute.manager [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] [instance: 8d78f310-a2f2-4073-8371-afc42cc566f2] Took 0.03 seconds to deallocate network for instance. [ 1184.509438] env[60722]: DEBUG oslo_concurrency.lockutils [None req-1c04428f-bcff-4f19-a264-6dbe05b521dc tempest-ServerMetadataTestJSON-676342327 tempest-ServerMetadataTestJSON-676342327-project-member] Lock "8d78f310-a2f2-4073-8371-afc42cc566f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 278.755s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1184.549364] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1184.550170] env[60722]: ERROR nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. 
[ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Traceback (most recent call last): [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] result = getattr(controller, method)(*args, **kwargs) [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self._get(image_id) [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] resp, body = self.http_client.get(url, headers=header) [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self.request(url, 'GET', **kwargs) [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self._handle_response(resp) [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] raise exc.from_response(resp, resp.content) [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] During handling of the above exception, another exception occurred: [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Traceback (most recent call last): [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] yield resources [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self.driver.spawn(context, instance, image_meta, [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self._fetch_image_if_missing(context, vi) [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] image_fetch(context, vi, tmp_image_ds_loc) [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] images.fetch_image( [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1184.550170] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] metadata = IMAGE_API.get(context, image_ref) [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return session.show(context, image_id, [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] _reraise_translated_image_exception(image_id) [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] raise new_exc.with_traceback(exc_trace) [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] result = getattr(controller, method)(*args, **kwargs) [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self._get(image_id) [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] resp, body = self.http_client.get(url, headers=header) [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self.request(url, 'GET', **kwargs) [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self._handle_response(resp) [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] raise exc.from_response(resp, resp.content) [ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. 
[ 1184.551416] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1184.551416] env[60722]: INFO nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Terminating instance [ 1184.552556] env[60722]: DEBUG nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1184.552739] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1184.553597] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c806a6f6-ce6e-40b7-aedc-320e2e51072b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.561110] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1184.561311] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dd8c68f6-b02d-452f-acfc-e2e4f0336406 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.620537] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1184.620733] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1184.620847] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Deleting the datastore file [datastore1] 2d58b057-fec8-4c3c-bf83-452d27abfd38 {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1184.621108] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9ff1eea8-c7f3-4b0a-a068-e92362c681ca {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.627451] env[60722]: DEBUG oslo_vmware.api [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 
tempest-AttachInterfacesTestJSON-1587176260-project-member] Waiting for the task: (returnval){ [ 1184.627451] env[60722]: value = "task-565244" [ 1184.627451] env[60722]: _type = "Task" [ 1184.627451] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1184.635665] env[60722]: DEBUG oslo_vmware.api [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Task: {'id': task-565244, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1185.020028] env[60722]: INFO nova.compute.manager [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Rebuilding instance [ 1185.050455] env[60722]: DEBUG nova.objects.instance [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lazy-loading 'trusted_certs' on Instance uuid 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa {{(pid=60722) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1185.061425] env[60722]: DEBUG nova.compute.manager [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Checking state {{(pid=60722) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 1185.062423] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2b740ef-fad3-4a16-96d3-88202b176cc4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1185.096616] env[60722]: DEBUG nova.objects.instance [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lazy-loading 'pci_requests' on Instance uuid 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa {{(pid=60722) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1185.103910] env[60722]: DEBUG nova.objects.instance [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lazy-loading 'pci_devices' on Instance uuid 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa {{(pid=60722) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1185.110648] env[60722]: DEBUG nova.objects.instance [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lazy-loading 'resources' on Instance uuid 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa {{(pid=60722) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1185.116299] env[60722]: DEBUG nova.objects.instance [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lazy-loading 'migration_context' on Instance uuid 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa {{(pid=60722) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1185.122857] env[60722]: DEBUG nova.objects.instance [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 
tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Trying to apply a migration context that does not seem to be set for this instance {{(pid=60722) apply_migration_context /opt/stack/nova/nova/objects/instance.py:1032}} [ 1185.123285] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Powering off the VM {{(pid=60722) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 1185.123516] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-0a66dab1-4e7b-4a1a-87fa-2f4dbf4cfd2c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1185.133219] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Waiting for the task: (returnval){ [ 1185.133219] env[60722]: value = "task-565245" [ 1185.133219] env[60722]: _type = "Task" [ 1185.133219] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1185.139381] env[60722]: DEBUG oslo_vmware.api [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Task: {'id': task-565244, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.090953} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1185.140125] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1185.140432] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1185.140710] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1185.140989] env[60722]: INFO nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Took 0.59 seconds to destroy the instance on the hypervisor. [ 1185.146970] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565245, 'name': PowerOffVM_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1185.147475] env[60722]: DEBUG nova.compute.claims [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1185.147667] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1185.147901] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1185.174480] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1185.175374] env[60722]: DEBUG nova.compute.utils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Instance 2d58b057-fec8-4c3c-bf83-452d27abfd38 could not be found. {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1185.177348] env[60722]: DEBUG nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Instance disappeared during build. {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1185.177441] env[60722]: DEBUG nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1185.177626] env[60722]: DEBUG nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1185.177862] env[60722]: DEBUG nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1185.178137] env[60722]: DEBUG nova.network.neutron [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1185.274795] env[60722]: DEBUG neutronclient.v2_0.client [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60722) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1185.276366] env[60722]: ERROR nova.compute.manager [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Traceback (most recent call last): [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] result = getattr(controller, method)(*args, **kwargs) [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self._get(image_id) [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] resp, body = self.http_client.get(url, headers=header) [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self.request(url, 'GET', **kwargs) [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self._handle_response(resp) [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] raise exc.from_response(resp, resp.content) [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] During handling of the above exception, another exception occurred: [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Traceback (most recent call last): [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self.driver.spawn(context, instance, image_meta, [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self._fetch_image_if_missing(context, vi) [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] image_fetch(context, vi, tmp_image_ds_loc) [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] images.fetch_image( [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 
2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] metadata = IMAGE_API.get(context, image_ref) [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return session.show(context, image_id, [ 1185.276366] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] _reraise_translated_image_exception(image_id) [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] raise new_exc.with_traceback(exc_trace) [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] result = getattr(controller, method)(*args, **kwargs) [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self._get(image_id) [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] resp, body = self.http_client.get(url, headers=header) [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self.request(url, 'GET', **kwargs) [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] 
return self._handle_response(resp) [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] raise exc.from_response(resp, resp.content) [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] nova.exception.ImageNotAuthorized: Not authorized for image 125a38d9-0f4e-49a0-83bc-e50e222251c8. [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] During handling of the above exception, another exception occurred: [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Traceback (most recent call last): [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self._build_and_run_instance(context, instance, image, [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] with excutils.save_and_reraise_exception(): [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self.force_reraise() [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] raise self.value [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] with self.rt.instance_claim(context, instance, node, allocs, [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self.abort() [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/compute/claims.py", line 85, in abort [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self.tracker.abort_instance_claim(self.context, self.instance, [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File 
"/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1185.277479] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return f(*args, **kwargs) [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self._unset_instance_host_and_node(instance) [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] instance.save() [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] updates, result = self.indirection_api.object_action( [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return cctxt.call(context, 'object_action', objinst=objinst, [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] result = self.transport._send( [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self._driver.send(target, ctxt, message, [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] raise result [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] nova.exception_Remote.InstanceNotFound_Remote: Instance 2d58b057-fec8-4c3c-bf83-452d27abfd38 could not be found. 
[ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Traceback (most recent call last): [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return getattr(target, method)(*args, **kwargs) [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return fn(self, *args, **kwargs) [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] old_ref, inst_ref = db.instance_update_and_get_original( [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return f(*args, **kwargs) [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] with excutils.save_and_reraise_exception() as ectxt: [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self.force_reraise() [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] raise self.value [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return f(*args, 
**kwargs) [ 1185.279528] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return f(context, *args, **kwargs) [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] raise exception.InstanceNotFound(instance_id=uuid) [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] nova.exception.InstanceNotFound: Instance 2d58b057-fec8-4c3c-bf83-452d27abfd38 could not be found. [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] During handling of the above exception, another exception occurred: [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Traceback (most recent call last): [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] ret = obj(*args, **kwargs) [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] exception_handler_v20(status_code, error_body) [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] raise client_exc(message=error_message, [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1185.280782] 
env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Neutron server returns request_ids: ['req-a365fb18-bd24-4aa1-837b-15cb383e931d'] [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] During handling of the above exception, another exception occurred: [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] Traceback (most recent call last): [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self._deallocate_network(context, instance, requested_networks) [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self.network_api.deallocate_for_instance( [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/network/neutron.py", line 1805, in deallocate_for_instance [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] data = neutron.list_ports(**search_opts) [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] ret = obj(*args, **kwargs) [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self.list('ports', self.ports_path, retrieve_all, [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] ret = obj(*args, **kwargs) [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1185.280782] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] for r in self._pagination(collection, path, **params): [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] res = self.get(path, params=params) [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] ret = obj(*args, **kwargs) [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self.retry_request("GET", action, body=body, [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] ret = obj(*args, **kwargs) [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] return self.do_request(method, action, body=body, [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] ret = obj(*args, **kwargs) [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] self._handle_fault_response(status_code, replybody, resp) [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] raise exception.Unauthorized() [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] nova.exception.Unauthorized: Not authorized. [ 1185.282877] env[60722]: ERROR nova.compute.manager [instance: 2d58b057-fec8-4c3c-bf83-452d27abfd38] [ 1185.300708] env[60722]: DEBUG oslo_concurrency.lockutils [None req-e9773fe3-ab9f-4072-b711-cb4c672e512f tempest-AttachInterfacesTestJSON-1587176260 tempest-AttachInterfacesTestJSON-1587176260-project-member] Lock "2d58b057-fec8-4c3c-bf83-452d27abfd38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 253.613s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1185.642522] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565245, 'name': PowerOffVM_Task, 'duration_secs': 0.188431} completed successfully. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1185.642894] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Powered off the VM {{(pid=60722) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 1185.643486] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Powering off the VM {{(pid=60722) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 1185.643713] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-a4494e09-0859-4f92-ab51-1db6407ab147 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1185.649751] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Waiting for the task: (returnval){ [ 1185.649751] env[60722]: value = "task-565246" [ 1185.649751] env[60722]: _type = "Task" [ 1185.649751] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1185.656994] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565246, 'name': PowerOffVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1186.160256] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] VM already powered off {{(pid=60722) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1509}} [ 1186.160446] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Volume detach. 
Driver type: vmdk {{(pid=60722) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 1186.160632] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-141664', 'volume_id': 'cdb00472-d087-470b-bc90-3c9a91203f67', 'name': 'volume-cdb00472-d087-470b-bc90-3c9a91203f67', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa', 'attached_at': '', 'detached_at': '', 'volume_id': 'cdb00472-d087-470b-bc90-3c9a91203f67', 'serial': 'cdb00472-d087-470b-bc90-3c9a91203f67'} {{(pid=60722) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 1186.161404] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b2b3f38-f749-449d-8dff-9277fe1bb63e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1186.178574] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-977a1acc-5412-4979-89d6-d37822251308 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1186.184636] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6d9d293-52e1-4503-ba89-1b1e2b35f0d0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1186.201119] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ba14f7e-bd87-4179-ad4f-625f22e08787 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1186.216321] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] The volume has not been displaced from its original location: [datastore1] volume-cdb00472-d087-470b-bc90-3c9a91203f67/volume-cdb00472-d087-470b-bc90-3c9a91203f67.vmdk. No consolidation needed. {{(pid=60722) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 1186.221327] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Reconfiguring VM instance instance-0000001d to detach disk 2000 {{(pid=60722) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 1186.221564] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-1d1154bc-c5ad-4954-bdcb-84bd9769cded {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1186.239029] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Waiting for the task: (returnval){ [ 1186.239029] env[60722]: value = "task-565247" [ 1186.239029] env[60722]: _type = "Task" [ 1186.239029] env[60722]: } to complete. 
{{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1186.246200] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565247, 'name': ReconfigVM_Task} progress is 5%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1186.749125] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565247, 'name': ReconfigVM_Task, 'duration_secs': 0.17385} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1186.749561] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Reconfigured VM instance instance-0000001d to detach disk 2000 {{(pid=60722) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 1186.753887] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-1f5c9781-043a-48da-b4da-17bbf52041dc {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1186.768010] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Waiting for the task: (returnval){ [ 1186.768010] env[60722]: value = "task-565248" [ 1186.768010] env[60722]: _type = "Task" [ 1186.768010] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1186.775251] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565248, 'name': ReconfigVM_Task} progress is 5%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1187.277221] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565248, 'name': ReconfigVM_Task, 'duration_secs': 0.190826} completed successfully. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1187.277509] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-141664', 'volume_id': 'cdb00472-d087-470b-bc90-3c9a91203f67', 'name': 'volume-cdb00472-d087-470b-bc90-3c9a91203f67', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa', 'attached_at': '', 'detached_at': '', 'volume_id': 'cdb00472-d087-470b-bc90-3c9a91203f67', 'serial': 'cdb00472-d087-470b-bc90-3c9a91203f67'} {{(pid=60722) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 1187.277728] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1187.278435] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07737287-338e-4d22-863d-3c9a0939b27a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1187.284417] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1187.284604] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4bb2e74f-8160-4078-b728-2172a2d6ba8d {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1187.409452] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1187.409620] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1187.409797] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Deleting the datastore file [datastore1] 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1187.410052] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-536f5f9e-468a-419c-bb38-307b04c67d58 {{(pid=60722) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1187.416375] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Waiting for the task: (returnval){ [ 1187.416375] env[60722]: value = "task-565250" [ 1187.416375] env[60722]: _type = "Task" [ 1187.416375] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1187.423791] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565250, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1187.926669] env[60722]: DEBUG oslo_vmware.api [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Task: {'id': task-565250, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.123232} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1187.927050] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1187.927117] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1187.927232] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1187.974779] env[60722]: DEBUG nova.virt.vmwareapi.volumeops [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Volume detach. 
Driver type: vmdk {{(pid=60722) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 1187.975116] env[60722]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-79bffc21-e9ab-4ca7-ba44-efe7965cd653 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1187.983560] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81e11a0f-b672-4c44-8a66-99ac3e7a91b2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.005364] env[60722]: ERROR nova.compute.manager [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Failed to detach volume cdb00472-d087-470b-bc90-3c9a91203f67 from /dev/sda: nova.exception.InstanceNotFound: Instance 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa could not be found. [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Traceback (most recent call last): [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] self.driver.rebuild(**kwargs) [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] raise NotImplementedError() [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] NotImplementedError [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] During handling of the above exception, another exception occurred: [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Traceback (most recent call last): [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] self.driver.detach_volume(context, old_connection_info, [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] return self._volumeops.detach_volume(connection_info, instance) [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] self._detach_volume_vmdk(connection_info, instance) [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 
4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] stable_ref.fetch_moref(session) [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] raise exception.InstanceNotFound(instance_id=self._uuid) [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] nova.exception.InstanceNotFound: Instance 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa could not be found. [ 1188.005364] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] [ 1188.125642] env[60722]: DEBUG nova.compute.utils [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Build of instance 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa aborted: Failed to rebuild volume backed instance. {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1188.128010] env[60722]: ERROR nova.compute.manager [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Setting instance vm_state to ERROR: nova.exception.BuildAbortException: Build of instance 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa aborted: Failed to rebuild volume backed instance. 
[ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Traceback (most recent call last): [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] self.driver.rebuild(**kwargs) [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] raise NotImplementedError() [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] NotImplementedError [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] During handling of the above exception, another exception occurred: [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Traceback (most recent call last): [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/compute/manager.py", line 3570, in _rebuild_volume_backed_instance [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] self._detach_root_volume(context, instance, root_bdm) [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/compute/manager.py", line 3549, in _detach_root_volume [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] with excutils.save_and_reraise_exception(): [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] self.force_reraise() [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] raise self.value [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] self.driver.detach_volume(context, old_connection_info, [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] return self._volumeops.detach_volume(connection_info, instance) [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File 
"/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] self._detach_volume_vmdk(connection_info, instance) [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] stable_ref.fetch_moref(session) [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] raise exception.InstanceNotFound(instance_id=self._uuid) [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] nova.exception.InstanceNotFound: Instance 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa could not be found. [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] During handling of the above exception, another exception occurred: [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Traceback (most recent call last): [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/compute/manager.py", line 10732, in _error_out_instance_on_exception [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] yield [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/compute/manager.py", line 3826, in rebuild_instance [ 1188.128010] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] self._do_rebuild_instance_with_claim( [ 1188.129409] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/compute/manager.py", line 3912, in _do_rebuild_instance_with_claim [ 1188.129409] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] self._do_rebuild_instance( [ 1188.129409] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/compute/manager.py", line 4104, in _do_rebuild_instance [ 1188.129409] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] self._rebuild_default_impl(**kwargs) [ 1188.129409] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/compute/manager.py", line 3693, in _rebuild_default_impl [ 1188.129409] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] 
self._rebuild_volume_backed_instance( [ 1188.129409] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] File "/opt/stack/nova/nova/compute/manager.py", line 3585, in _rebuild_volume_backed_instance [ 1188.129409] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] raise exception.BuildAbortException( [ 1188.129409] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] nova.exception.BuildAbortException: Build of instance 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa aborted: Failed to rebuild volume backed instance. [ 1188.129409] env[60722]: ERROR nova.compute.manager [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] [ 1188.207689] env[60722]: DEBUG oslo_concurrency.lockutils [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1188.207888] env[60722]: DEBUG oslo_concurrency.lockutils [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1188.220862] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0f8a8eb-6a29-4b5b-887b-96448752bb56 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.228381] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3748c0df-0599-4f14-9ec3-e8764f6ee22e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.258427] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a84d8222-6760-4aee-a014-756322e0d4ef {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.265087] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5bd55cc-4210-4147-bc12-9365cd87a1cc {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.277509] env[60722]: DEBUG nova.compute.provider_tree [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1188.285302] env[60722]: DEBUG nova.scheduler.client.report [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1188.299255] env[60722]: DEBUG oslo_concurrency.lockutils [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1188.299432] env[60722]: INFO nova.compute.manager [None req-22149bf4-d573-42f2-bddc-0bd3fb29e8e8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Successfully reverted task state from rebuilding on failure for instance. [ 1188.658622] env[60722]: DEBUG oslo_concurrency.lockutils [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Acquiring lock "4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1188.658850] env[60722]: DEBUG oslo_concurrency.lockutils [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lock "4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1188.659056] env[60722]: DEBUG oslo_concurrency.lockutils [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Acquiring lock "4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1188.659236] env[60722]: DEBUG oslo_concurrency.lockutils [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lock "4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1188.659454] env[60722]: DEBUG oslo_concurrency.lockutils [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lock "4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1188.661536] env[60722]: INFO nova.compute.manager [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Terminating instance [ 1188.663289] env[60722]: DEBUG 
nova.compute.manager [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Start destroying the instance on the hypervisor. {{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1188.663743] env[60722]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-41f60d7b-0e66-4261-872a-c019d86b036b {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.673978] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cf5619e-9fed-4659-9605-9c282073144c {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.695289] env[60722]: WARNING nova.virt.vmwareapi.driver [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Instance does not exists. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa could not be found. [ 1188.695450] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1188.695692] env[60722]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-25f53e1d-57f0-4ac7-b266-92c5befdac60 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.703226] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d06882d-53d9-468d-9018-045a03208205 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1188.723123] env[60722]: WARNING nova.virt.vmwareapi.vmops [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa could not be found. [ 1188.723321] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1188.723490] env[60722]: INFO nova.compute.manager [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Took 0.06 seconds to destroy the instance on the hypervisor. 
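Editor's note: the two warnings just above ("Instance does not exists. Proceeding to delete instance properties on datastore" and "Instance does not exist on backend") show the terminate path tolerating a VM that the earlier failed rebuild had already removed from vCenter, so the destroy still completes in 0.06 seconds. Below is a minimal sketch of that tolerance pattern, with hypothetical callables (lookup_vm_ref, unregister_vm, delete_datastore_files) standing in for the driver internals; it only mirrors the behaviour the log records and is not Nova's actual code.

    # Illustrative sketch only -- not Nova's implementation.
    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def destroy_instance(instance_uuid, lookup_vm_ref, unregister_vm,
                         delete_datastore_files, log):
        try:
            vm_ref = lookup_vm_ref(instance_uuid)
        except InstanceNotFound:
            # Corresponds to "Instance does not exist on backend" above:
            # the failed rebuild already unregistered the VM, so there is
            # nothing left to tear down on the hypervisor side.
            log("Instance %s does not exist on backend" % instance_uuid)
            return
        unregister_vm(vm_ref)                  # UnregisterVM
        delete_datastore_files(instance_uuid)  # DeleteDatastoreFile_Task
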
[ 1188.723714] env[60722]: DEBUG oslo.service.loopingcall [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1188.723894] env[60722]: DEBUG nova.compute.manager [-] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1188.724015] env[60722]: DEBUG nova.network.neutron [-] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1189.182746] env[60722]: DEBUG nova.network.neutron [-] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1189.194766] env[60722]: INFO nova.compute.manager [-] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Took 0.47 seconds to deallocate network for instance. [ 1189.199212] env[60722]: DEBUG nova.compute.manager [req-6816f421-d288-4313-b725-04fe86dceedf req-e46fa740-12f4-4be0-88c1-521a629c8801 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Received event network-vif-deleted-dba6226f-09ef-4871-9f85-78b3464b9af5 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1189.199397] env[60722]: INFO nova.compute.manager [req-6816f421-d288-4313-b725-04fe86dceedf req-e46fa740-12f4-4be0-88c1-521a629c8801 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Neutron deleted interface dba6226f-09ef-4871-9f85-78b3464b9af5; detaching it from the instance and deleting it from the info cache [ 1189.199556] env[60722]: DEBUG nova.network.neutron [req-6816f421-d288-4313-b725-04fe86dceedf req-e46fa740-12f4-4be0-88c1-521a629c8801 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1189.207899] env[60722]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5f1b6333-b809-4212-af5a-416da36d8ac1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.217933] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68e35b2d-7757-4cdc-b583-02a18e017a26 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.241770] env[60722]: DEBUG nova.compute.manager [req-6816f421-d288-4313-b725-04fe86dceedf req-e46fa740-12f4-4be0-88c1-521a629c8801 service nova] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Detach interface failed, port_id=dba6226f-09ef-4871-9f85-78b3464b9af5, reason: Instance 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa could not be found. 
{{(pid=60722) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10832}} [ 1189.265908] env[60722]: INFO nova.compute.manager [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Took 0.07 seconds to detach 1 volumes for instance. [ 1189.268079] env[60722]: DEBUG nova.compute.manager [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] [instance: 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa] Deleting volume: cdb00472-d087-470b-bc90-3c9a91203f67 {{(pid=60722) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3217}} [ 1189.332567] env[60722]: DEBUG oslo_concurrency.lockutils [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1189.332835] env[60722]: DEBUG oslo_concurrency.lockutils [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1189.333222] env[60722]: DEBUG nova.objects.instance [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lazy-loading 'resources' on Instance uuid 4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa {{(pid=60722) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 1189.358691] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09a6ec95-3fc8-4566-9023-0311a5269389 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.367044] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6598bca-8d79-4d41-a00f-8cf77a19eb11 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.399040] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe32306c-70b2-4ff5-b026-6d74995b8877 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.406389] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf2e8774-31bf-4d91-a32b-24417a908a81 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1189.420669] env[60722]: DEBUG nova.compute.provider_tree [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1189.429337] env[60722]: DEBUG nova.scheduler.client.report [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 
tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1189.442761] env[60722]: DEBUG oslo_concurrency.lockutils [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1189.504217] env[60722]: DEBUG oslo_concurrency.lockutils [None req-71ae09e1-8f32-461c-bdc2-17b78d3562a8 tempest-ServerActionsV293TestJSON-706775157 tempest-ServerActionsV293TestJSON-706775157-project-member] Lock "4d5d77e3-c1c1-432f-83dd-ef33c1f8d9fa" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.844s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1192.118469] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1192.126829] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Getting list of instances from cluster (obj){ [ 1192.126829] env[60722]: value = "domain-c8" [ 1192.126829] env[60722]: _type = "ClusterComputeResource" [ 1192.126829] env[60722]: } {{(pid=60722) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1192.127754] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16e8c781-5c94-4b41-b4c2-6869a50a906e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.137035] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Got total of 0 instances {{(pid=60722) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1194.720637] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Acquiring lock "c8db5dfd-f361-4282-82a8-83552034e319" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1194.721042] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Lock "c8db5dfd-f361-4282-82a8-83552034e319" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60722) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1194.735766] env[60722]: DEBUG nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Starting instance... {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1194.778904] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1194.779092] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1194.780458] env[60722]: INFO nova.compute.claims [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1194.851272] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30453ea9-783b-4aa2-bd42-b53ba08f6351 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1194.859598] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85cc4ae1-ff0f-4405-9b8f-4eff2b475d78 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1194.893090] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f06fd60c-43e7-4391-99b8-5576f4069245 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1194.901375] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28c6e8a2-2e36-4e99-87c1-54f18f5a1da5 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1194.914903] env[60722]: DEBUG nova.compute.provider_tree [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1194.923515] env[60722]: DEBUG nova.scheduler.client.report [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 
'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1194.935791] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1194.936240] env[60722]: DEBUG nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Start building networks asynchronously for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1194.966521] env[60722]: DEBUG nova.compute.utils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Using /dev/sd instead of None {{(pid=60722) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1194.967674] env[60722]: DEBUG nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Allocating IP information in the background. {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1194.967836] env[60722]: DEBUG nova.network.neutron [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] allocate_for_instance() {{(pid=60722) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1194.980914] env[60722]: DEBUG nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Start building block device mappings for instance. {{(pid=60722) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1195.037400] env[60722]: DEBUG nova.policy [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bdcaaf4b82e041639acef9f0a22c24fe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4d41c71183da43dc845e99287fdc0cc6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60722) authorize /opt/stack/nova/nova/policy.py:203}} [ 1195.052142] env[60722]: DEBUG nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Start spawning the instance on the hypervisor. 
{{(pid=60722) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1195.073138] env[60722]: DEBUG nova.virt.hardware [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-01T06:35:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-01T06:35:21Z,direct_url=,disk_format='vmdk',id=125a38d9-0f4e-49a0-83bc-e50e222251c8,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='0b91883855c8437587c531188adfc164',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-01T06:35:22Z,virtual_size=,visibility=), allow threads: False {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1195.073387] env[60722]: DEBUG nova.virt.hardware [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Flavor limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1195.073538] env[60722]: DEBUG nova.virt.hardware [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Image limits 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1195.073713] env[60722]: DEBUG nova.virt.hardware [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Flavor pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1195.073853] env[60722]: DEBUG nova.virt.hardware [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Image pref 0:0:0 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1195.073994] env[60722]: DEBUG nova.virt.hardware [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60722) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1195.074206] env[60722]: DEBUG nova.virt.hardware [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1195.074355] env[60722]: DEBUG nova.virt.hardware [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1195.074511] env[60722]: DEBUG nova.virt.hardware [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 
tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Got 1 possible topologies {{(pid=60722) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1195.074679] env[60722]: DEBUG nova.virt.hardware [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1195.074819] env[60722]: DEBUG nova.virt.hardware [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60722) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1195.075866] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa5ded74-898c-4500-8769-56faa2448f6f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.083989] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35a97261-a68d-4a26-abe9-a0600ea02de1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1195.318665] env[60722]: DEBUG nova.network.neutron [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Successfully created port: cfb064b9-f5fa-464f-a9cc-952a0ff349b9 {{(pid=60722) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1195.837385] env[60722]: DEBUG nova.compute.manager [req-8bac11b1-42e9-47a0-8b43-d6601ad65336 req-39009353-cdfe-4e23-836a-f7ac16e63a71 service nova] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Received event network-vif-plugged-cfb064b9-f5fa-464f-a9cc-952a0ff349b9 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1195.838035] env[60722]: DEBUG oslo_concurrency.lockutils [req-8bac11b1-42e9-47a0-8b43-d6601ad65336 req-39009353-cdfe-4e23-836a-f7ac16e63a71 service nova] Acquiring lock "c8db5dfd-f361-4282-82a8-83552034e319-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1195.838035] env[60722]: DEBUG oslo_concurrency.lockutils [req-8bac11b1-42e9-47a0-8b43-d6601ad65336 req-39009353-cdfe-4e23-836a-f7ac16e63a71 service nova] Lock "c8db5dfd-f361-4282-82a8-83552034e319-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1195.838035] env[60722]: DEBUG oslo_concurrency.lockutils [req-8bac11b1-42e9-47a0-8b43-d6601ad65336 req-39009353-cdfe-4e23-836a-f7ac16e63a71 service nova] Lock "c8db5dfd-f361-4282-82a8-83552034e319-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1195.838260] env[60722]: DEBUG nova.compute.manager [req-8bac11b1-42e9-47a0-8b43-d6601ad65336 req-39009353-cdfe-4e23-836a-f7ac16e63a71 service nova] [instance: 
c8db5dfd-f361-4282-82a8-83552034e319] No waiting events found dispatching network-vif-plugged-cfb064b9-f5fa-464f-a9cc-952a0ff349b9 {{(pid=60722) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1195.838260] env[60722]: WARNING nova.compute.manager [req-8bac11b1-42e9-47a0-8b43-d6601ad65336 req-39009353-cdfe-4e23-836a-f7ac16e63a71 service nova] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Received unexpected event network-vif-plugged-cfb064b9-f5fa-464f-a9cc-952a0ff349b9 for instance with vm_state building and task_state spawning. [ 1195.913040] env[60722]: DEBUG nova.network.neutron [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Successfully updated port: cfb064b9-f5fa-464f-a9cc-952a0ff349b9 {{(pid=60722) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1195.920778] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Acquiring lock "refresh_cache-c8db5dfd-f361-4282-82a8-83552034e319" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1195.920887] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Acquired lock "refresh_cache-c8db5dfd-f361-4282-82a8-83552034e319" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1195.920954] env[60722]: DEBUG nova.network.neutron [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Building network info cache for instance {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2009}} [ 1195.958506] env[60722]: DEBUG nova.network.neutron [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Instance cache missing network info. 
{{(pid=60722) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3322}} [ 1196.100218] env[60722]: DEBUG nova.network.neutron [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Updating instance_info_cache with network_info: [{"id": "cfb064b9-f5fa-464f-a9cc-952a0ff349b9", "address": "fa:16:3e:56:c7:fd", "network": {"id": "60090b59-ae87-4251-9a2a-d20f93368465", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-703026459-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4d41c71183da43dc845e99287fdc0cc6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d1da5fc2-0280-4f76-ac97-20ea4bc7bb16", "external-id": "nsx-vlan-transportzone-563", "segmentation_id": 563, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcfb064b9-f5", "ovs_interfaceid": "cfb064b9-f5fa-464f-a9cc-952a0ff349b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1196.110094] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Releasing lock "refresh_cache-c8db5dfd-f361-4282-82a8-83552034e319" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1196.110355] env[60722]: DEBUG nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Instance network_info: |[{"id": "cfb064b9-f5fa-464f-a9cc-952a0ff349b9", "address": "fa:16:3e:56:c7:fd", "network": {"id": "60090b59-ae87-4251-9a2a-d20f93368465", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-703026459-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4d41c71183da43dc845e99287fdc0cc6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d1da5fc2-0280-4f76-ac97-20ea4bc7bb16", "external-id": "nsx-vlan-transportzone-563", "segmentation_id": 563, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcfb064b9-f5", "ovs_interfaceid": "cfb064b9-f5fa-464f-a9cc-952a0ff349b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60722) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1196.110688] env[60722]: DEBUG 
nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:56:c7:fd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd1da5fc2-0280-4f76-ac97-20ea4bc7bb16', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cfb064b9-f5fa-464f-a9cc-952a0ff349b9', 'vif_model': 'vmxnet3'}] {{(pid=60722) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1196.117883] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Creating folder: Project (4d41c71183da43dc845e99287fdc0cc6). Parent ref: group-v141606. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1196.118320] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0db9ec01-a956-4611-b064-b751e3054952 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.130322] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Created folder: Project (4d41c71183da43dc845e99287fdc0cc6) in parent group-v141606. [ 1196.130495] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Creating folder: Instances. Parent ref: group-v141667. {{(pid=60722) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1196.130696] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9b938431-9c3c-4537-b305-342e10abea6f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.140221] env[60722]: INFO nova.virt.vmwareapi.vm_util [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Created folder: Instances in parent group-v141667. [ 1196.140476] env[60722]: DEBUG oslo.service.loopingcall [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60722) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1196.140639] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Creating VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1196.140871] env[60722]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c5907b64-d904-4542-8f3e-91be17e24924 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.159283] env[60722]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1196.159283] env[60722]: value = "task-565254" [ 1196.159283] env[60722]: _type = "Task" [ 1196.159283] env[60722]: } to complete. 
{{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1196.166765] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565254, 'name': CreateVM_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1196.669055] env[60722]: DEBUG oslo_vmware.api [-] Task: {'id': task-565254, 'name': CreateVM_Task, 'duration_secs': 0.300584} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1196.669249] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Created VM on the ESX host {{(pid=60722) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1196.675735] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1196.675887] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1196.676205] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1196.676429] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-34a01285-4aee-453a-b7ba-3559a41b22e0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1196.680898] env[60722]: DEBUG oslo_vmware.api [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Waiting for the task: (returnval){ [ 1196.680898] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]52c90ec9-d2a7-f749-68a3-df75814ff2c6" [ 1196.680898] env[60722]: _type = "Task" [ 1196.680898] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1196.688063] env[60722]: DEBUG oslo_vmware.api [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]52c90ec9-d2a7-f749-68a3-df75814ff2c6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1197.191552] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1197.191916] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Processing image 125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1197.191989] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1197.192147] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Acquired lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1197.192318] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1197.192543] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-05ff27fd-c3bf-4b4c-842f-8471d0c66b0f {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.209020] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1197.209196] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60722) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1197.209854] env[60722]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f9801eb5-f0f5-461b-9653-dc2773be6eab {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.214922] env[60722]: DEBUG oslo_vmware.api [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Waiting for the task: (returnval){ [ 1197.214922] env[60722]: value = "session[52424491-492b-e038-d4a9-f02f8dc1ea37]5271d5b4-3a89-61ba-8615-b4c3e2296f37" [ 1197.214922] env[60722]: _type = "Task" [ 1197.214922] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1197.222345] env[60722]: DEBUG oslo_vmware.api [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Task: {'id': session[52424491-492b-e038-d4a9-f02f8dc1ea37]5271d5b4-3a89-61ba-8615-b4c3e2296f37, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1197.725210] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Preparing fetch location {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1197.725435] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Creating directory with path [datastore1] vmware_temp/2dd79523-13c3-4250-ae04-231a69f509c3/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1197.725654] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d8ab4549-99e8-436d-9241-d463d5f249c2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.744384] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Created directory with path [datastore1] vmware_temp/2dd79523-13c3-4250-ae04-231a69f509c3/125a38d9-0f4e-49a0-83bc-e50e222251c8 {{(pid=60722) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1197.744556] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Fetch image to [datastore1] vmware_temp/2dd79523-13c3-4250-ae04-231a69f509c3/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1197.744720] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to [datastore1] 
vmware_temp/2dd79523-13c3-4250-ae04-231a69f509c3/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1197.745413] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fc9ef41-4601-4a6d-b348-10ff3574d888 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.751645] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a58d20bb-e2b8-4851-8af5-a941c2ff644e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.760150] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53f8ec5a-6f26-4d4c-b2dd-c5f8dfc495d1 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.790699] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59affa73-16c2-4eaf-b5ba-1a068d127f07 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.796407] env[60722]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-347f1006-a6af-4b71-ae4a-8cbe4378c1ba {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1197.816810] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Downloading image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1197.860556] env[60722]: DEBUG oslo_vmware.rw_handles [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2dd79523-13c3-4250-ae04-231a69f509c3/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1197.863733] env[60722]: DEBUG nova.compute.manager [req-d50488a0-4bab-4b9c-a2dd-1023a82db679 req-7edf0439-d6da-4ee8-bec0-b20507c52f0d service nova] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Received event network-changed-cfb064b9-f5fa-464f-a9cc-952a0ff349b9 {{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:10998}} [ 1197.863929] env[60722]: DEBUG nova.compute.manager [req-d50488a0-4bab-4b9c-a2dd-1023a82db679 req-7edf0439-d6da-4ee8-bec0-b20507c52f0d service nova] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Refreshing instance network info cache due to event network-changed-cfb064b9-f5fa-464f-a9cc-952a0ff349b9. 
{{(pid=60722) external_instance_event /opt/stack/nova/nova/compute/manager.py:11003}} [ 1197.864153] env[60722]: DEBUG oslo_concurrency.lockutils [req-d50488a0-4bab-4b9c-a2dd-1023a82db679 req-7edf0439-d6da-4ee8-bec0-b20507c52f0d service nova] Acquiring lock "refresh_cache-c8db5dfd-f361-4282-82a8-83552034e319" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1197.864298] env[60722]: DEBUG oslo_concurrency.lockutils [req-d50488a0-4bab-4b9c-a2dd-1023a82db679 req-7edf0439-d6da-4ee8-bec0-b20507c52f0d service nova] Acquired lock "refresh_cache-c8db5dfd-f361-4282-82a8-83552034e319" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1197.864504] env[60722]: DEBUG nova.network.neutron [req-d50488a0-4bab-4b9c-a2dd-1023a82db679 req-7edf0439-d6da-4ee8-bec0-b20507c52f0d service nova] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Refreshing network info cache for port cfb064b9-f5fa-464f-a9cc-952a0ff349b9 {{(pid=60722) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2006}} [ 1197.921475] env[60722]: DEBUG oslo_vmware.rw_handles [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Completed reading data from the image iterator. {{(pid=60722) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1197.921475] env[60722]: DEBUG oslo_vmware.rw_handles [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2dd79523-13c3-4250-ae04-231a69f509c3/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60722) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1198.124876] env[60722]: DEBUG nova.network.neutron [req-d50488a0-4bab-4b9c-a2dd-1023a82db679 req-7edf0439-d6da-4ee8-bec0-b20507c52f0d service nova] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Updated VIF entry in instance network info cache for port cfb064b9-f5fa-464f-a9cc-952a0ff349b9. 
{{(pid=60722) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3481}} [ 1198.125273] env[60722]: DEBUG nova.network.neutron [req-d50488a0-4bab-4b9c-a2dd-1023a82db679 req-7edf0439-d6da-4ee8-bec0-b20507c52f0d service nova] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Updating instance_info_cache with network_info: [{"id": "cfb064b9-f5fa-464f-a9cc-952a0ff349b9", "address": "fa:16:3e:56:c7:fd", "network": {"id": "60090b59-ae87-4251-9a2a-d20f93368465", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-703026459-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4d41c71183da43dc845e99287fdc0cc6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d1da5fc2-0280-4f76-ac97-20ea4bc7bb16", "external-id": "nsx-vlan-transportzone-563", "segmentation_id": 563, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcfb064b9-f5", "ovs_interfaceid": "cfb064b9-f5fa-464f-a9cc-952a0ff349b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1198.134082] env[60722]: DEBUG oslo_concurrency.lockutils [req-d50488a0-4bab-4b9c-a2dd-1023a82db679 req-7edf0439-d6da-4ee8-bec0-b20507c52f0d service nova] Releasing lock "refresh_cache-c8db5dfd-f361-4282-82a8-83552034e319" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1200.964022] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1201.944729] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1201.944958] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1201.945132] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60722) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1203.944816] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1203.945290] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1203.955465] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1203.955664] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1203.955806] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1203.955952] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1203.956993] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5e3ce7e-feac-4cdd-88fb-b8e6b6be8024 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1203.965633] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c5f0150-061f-4170-bfdf-4dc1753c965a {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1203.979156] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c71bd332-5465-4f5d-b814-207bfacbb1da {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1203.985376] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1f38767-c789-4978-a6ff-b15ad14f1e7e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1204.014663] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181710MB free_disk=100GB free_vcpus=48 pci_devices=None {{(pid=60722) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1204.014813] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring 
lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1204.014980] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1204.052633] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Instance c8db5dfd-f361-4282-82a8-83552034e319 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60722) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1204.052828] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1204.053013] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1204.078082] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63269803-83f8-4f81-9c19-38a8f193ebb0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1204.085132] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9bfeaa9-b70f-4ffa-bf06-7325c1b5b8c2 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1204.113742] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6979184e-14ca-443a-a241-a315929e2501 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1204.120195] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78594437-b681-4de4-9888-60232172cfa7 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1204.132685] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1204.140713] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1204.153143] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1204.153306] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1205.148020] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1205.148464] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1205.148464] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1209.944582] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1209.944986] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Starting heal instance info cache {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9808}} [ 1209.944986] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Rebuilding the list of instances to heal {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1209.955247] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Skipping network cache update for instance because it is Building. {{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9821}} [ 1209.955433] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Didn't find any instances for network info cache update. 
{{(pid=60722) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9894}} [ 1246.159057] env[60722]: WARNING oslo_vmware.rw_handles [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1246.159057] env[60722]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1246.159057] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1246.159057] env[60722]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1246.159057] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1246.159057] env[60722]: ERROR oslo_vmware.rw_handles response.begin() [ 1246.159057] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1246.159057] env[60722]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1246.159057] env[60722]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1246.159057] env[60722]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1246.159057] env[60722]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1246.159057] env[60722]: ERROR oslo_vmware.rw_handles [ 1246.159057] env[60722]: DEBUG nova.virt.vmwareapi.images [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Downloaded image file data 125a38d9-0f4e-49a0-83bc-e50e222251c8 to vmware_temp/2dd79523-13c3-4250-ae04-231a69f509c3/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk on the data store datastore1 {{(pid=60722) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1246.160620] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Caching image {{(pid=60722) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1246.160892] env[60722]: DEBUG nova.virt.vmwareapi.vm_util [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Copying Virtual Disk [datastore1] vmware_temp/2dd79523-13c3-4250-ae04-231a69f509c3/125a38d9-0f4e-49a0-83bc-e50e222251c8/tmp-sparse.vmdk to [datastore1] vmware_temp/2dd79523-13c3-4250-ae04-231a69f509c3/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk {{(pid=60722) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1246.161194] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c8948427-b465-4d1d-9220-2032ffe64445 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1246.168617] env[60722]: DEBUG oslo_vmware.api [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Waiting for the task: (returnval){ [ 1246.168617] env[60722]: value = 
"task-565255" [ 1246.168617] env[60722]: _type = "Task" [ 1246.168617] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1246.176333] env[60722]: DEBUG oslo_vmware.api [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Task: {'id': task-565255, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1246.679341] env[60722]: DEBUG oslo_vmware.exceptions [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Fault InvalidArgument not matched. {{(pid=60722) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1246.679602] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Releasing lock "[datastore1] devstack-image-cache_base/125a38d9-0f4e-49a0-83bc-e50e222251c8/125a38d9-0f4e-49a0-83bc-e50e222251c8.vmdk" {{(pid=60722) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1246.680109] env[60722]: ERROR nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1246.680109] env[60722]: Faults: ['InvalidArgument'] [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] Traceback (most recent call last): [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] yield resources [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] self.driver.spawn(context, instance, image_meta, [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] self._fetch_image_if_missing(context, vi) [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] image_cache(vi, tmp_image_ds_loc) [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: 
c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] vm_util.copy_virtual_disk( [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] session._wait_for_task(vmdk_copy_task) [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] return self.wait_for_task(task_ref) [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] return evt.wait() [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] result = hub.switch() [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] return self.greenlet.switch() [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] self.f(*self.args, **self.kw) [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] raise exceptions.translate_fault(task_info.error) [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] Faults: ['InvalidArgument'] [ 1246.680109] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] [ 1246.681039] env[60722]: INFO nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Terminating instance [ 1246.683229] env[60722]: DEBUG nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Start destroying the instance on the hypervisor. 
{{(pid=60722) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1246.683413] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Destroying instance {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1246.684159] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13f058f2-dd7e-4922-8283-0b2b469dcce0 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1246.690682] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Unregistering the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1246.690881] env[60722]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-177eb9d0-2f96-4c89-92ad-4f6c97639ee9 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1246.752275] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Unregistered the VM {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1246.752478] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Deleting contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1246.752644] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Deleting the datastore file [datastore1] c8db5dfd-f361-4282-82a8-83552034e319 {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1246.752894] env[60722]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5ac153f3-06f9-4bf0-ab91-89a8755a9190 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1246.759206] env[60722]: DEBUG oslo_vmware.api [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Waiting for the task: (returnval){ [ 1246.759206] env[60722]: value = "task-565257" [ 1246.759206] env[60722]: _type = "Task" [ 1246.759206] env[60722]: } to complete. {{(pid=60722) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1246.766565] env[60722]: DEBUG oslo_vmware.api [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Task: {'id': task-565257, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1247.269060] env[60722]: DEBUG oslo_vmware.api [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Task: {'id': task-565257, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068467} completed successfully. {{(pid=60722) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1247.269432] env[60722]: DEBUG nova.virt.vmwareapi.ds_util [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Deleted the datastore file {{(pid=60722) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1247.269473] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Deleted contents of the VM from datastore datastore1 {{(pid=60722) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1247.269637] env[60722]: DEBUG nova.virt.vmwareapi.vmops [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Instance destroyed {{(pid=60722) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1247.269804] env[60722]: INFO nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Took 0.59 seconds to destroy the instance on the hypervisor. 
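
The destroy path just logged (VirtualMachine.UnregisterVM, then FileManager.DeleteDatastoreFile_Task, then waiting on task-565257) follows the same invoke-then-poll pattern used for every vCenter mutation in this trace, including the earlier Folder.CreateVM_Task. A minimal sketch of that pattern is below; the session object is assumed to behave like oslo_vmware.api.VMwareAPISession with its invoke_api and wait_for_task methods, and the helper itself is illustrative rather than the actual Nova ds_util code.

    # Illustrative sketch of the invoke-then-poll pattern visible above
    # (Folder.CreateVM_Task, FileManager.DeleteDatastoreFile_Task, ...).
    # 'session' is assumed to be an oslo_vmware.api.VMwareAPISession.
    def delete_datastore_file(session, ds_path, datacenter_ref):
        file_manager = session.vim.service_content.fileManager
        # Start the asynchronous vCenter task and get back a task reference,
        # e.g. "task-565257" in the log above.
        task_ref = session.invoke_api(
            session.vim, 'DeleteDatastoreFile_Task', file_manager,
            name=ds_path, datacenter=datacenter_ref)
        # Poll the task until it reports success ("completed successfully");
        # if the task ends in the error state, wait_for_task raises the
        # translated VMware fault instead of returning.
        return session.wait_for_task(task_ref)
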
[ 1247.271978] env[60722]: DEBUG nova.compute.claims [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Aborting claim: {{(pid=60722) abort /opt/stack/nova/nova/compute/claims.py:84}} [ 1247.272159] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1247.272356] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1247.334860] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34b2539f-711f-462c-83cb-ca9beea512be {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1247.341676] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc7933ea-c813-4716-b4e4-ec489fac5b8e {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1247.371432] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ceaf6b73-9c2a-4dbb-a44b-8347f88e4a29 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1247.379800] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfd603a6-96fb-4e4c-a2d4-f5f839889523 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1247.398565] env[60722]: DEBUG nova.compute.provider_tree [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1247.406571] env[60722]: DEBUG nova.scheduler.client.report [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1247.418777] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.146s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1247.419286] env[60722]: ERROR nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1247.419286] env[60722]: Faults: ['InvalidArgument'] [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] Traceback (most recent call last): [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] self.driver.spawn(context, instance, image_meta, [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] self._fetch_image_if_missing(context, vi) [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] image_cache(vi, tmp_image_ds_loc) [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] vm_util.copy_virtual_disk( [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] session._wait_for_task(vmdk_copy_task) [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] return self.wait_for_task(task_ref) [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] return evt.wait() [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait 
[ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] result = hub.switch() [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] return self.greenlet.switch() [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] self.f(*self.args, **self.kw) [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] raise exceptions.translate_fault(task_info.error) [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] Faults: ['InvalidArgument'] [ 1247.419286] env[60722]: ERROR nova.compute.manager [instance: c8db5dfd-f361-4282-82a8-83552034e319] [ 1247.420067] env[60722]: DEBUG nova.compute.utils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] VimFaultException {{(pid=60722) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1247.421275] env[60722]: DEBUG nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Build of instance c8db5dfd-f361-4282-82a8-83552034e319 was re-scheduled: A specified parameter was not correct: fileType [ 1247.421275] env[60722]: Faults: ['InvalidArgument'] {{(pid=60722) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1247.421629] env[60722]: DEBUG nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Unplugging VIFs for instance {{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1247.421790] env[60722]: DEBUG nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60722) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1247.421954] env[60722]: DEBUG nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Deallocating network for instance {{(pid=60722) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1247.422126] env[60722]: DEBUG nova.network.neutron [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] deallocate_for_instance() {{(pid=60722) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1802}} [ 1247.680210] env[60722]: DEBUG nova.network.neutron [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Updating instance_info_cache with network_info: [] {{(pid=60722) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1247.691894] env[60722]: INFO nova.compute.manager [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] [instance: c8db5dfd-f361-4282-82a8-83552034e319] Took 0.27 seconds to deallocate network for instance. [ 1247.778030] env[60722]: INFO nova.scheduler.client.report [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Deleted allocations for instance c8db5dfd-f361-4282-82a8-83552034e319 [ 1247.794532] env[60722]: DEBUG oslo_concurrency.lockutils [None req-c1f3d924-8e29-49a5-b2c0-0992959e1017 tempest-ServerTagsTestJSON-1316334765 tempest-ServerTagsTestJSON-1316334765-project-member] Lock "c8db5dfd-f361-4282-82a8-83552034e319" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 53.074s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1262.944805] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1262.945150] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1262.945200] env[60722]: DEBUG nova.compute.manager [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60722) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10427}} [ 1263.944969] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1263.945390] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1263.945390] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1263.955838] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1263.956064] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1263.956213] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1263.956365] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60722) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1263.957410] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44feed19-fe1b-47da-a8e1-f5210c2216ef {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1263.966041] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de4b5a75-2731-4da8-8c2f-f32cd32a7113 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1263.979387] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccb3b079-89e4-4ddc-b743-cdb5943c93a4 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1263.985420] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ebae661-6f21-4312-bd4a-3828c496afed {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1264.614387] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 
[ 1264.614608] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1264.614792] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1264.650319] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1264.650489] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60722) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1264.664101] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f1c7aeb-6b63-4ec0-b9a2-2c41ab7c8175 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1264.670696] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c685ba1-47b6-4a89-8fb6-120c7f072157 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1264.699393] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51d5db9f-aa97-47e2-b739-c0c78f657506 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1264.706193] env[60722]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33c2bf23-9f1f-4347-8f0d-7f7dca4c4557 {{(pid=60722) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1264.718627] env[60722]: DEBUG nova.compute.provider_tree [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed in ProviderTree for provider: 6d7f336b-9351-4171-8197-866cdafbab42 {{(pid=60722) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1264.726323] env[60722]: DEBUG nova.scheduler.client.report [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Inventory has not changed for provider 6d7f336b-9351-4171-8197-866cdafbab42 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 100, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60722) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1264.738481] env[60722]: DEBUG nova.compute.resource_tracker [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60722) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
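The inventory dict reported to provider 6d7f336b-9351-4171-8197-866cdafbab42 above is what Placement uses to compute schedulable capacity, generally as (total - reserved) * allocation_ratio, with max_unit capping any single allocation. A small sketch of that arithmetic using the values from the entry above:

    # Values copied from the inventory data logged above.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 100},
    }
    for rc, inv in inventory.items():
        # Capacity available to the scheduler for this resource class.
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: schedulable={capacity:g}, single allocation <= {inv['max_unit']}")
    # VCPU: schedulable=192, single allocation <= 16
    # MEMORY_MB: schedulable=196078, single allocation <= 65530
    # DISK_GB: schedulable=400, single allocation <= 100

So with a 4.0 VCPU allocation ratio this node advertises 192 schedulable VCPUs on 48 physical ones, while memory and disk are not overcommitted.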
[ 1264.738644] env[60722]: DEBUG oslo_concurrency.lockutils [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s {{(pid=60722) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1265.738243] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1265.738593] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1265.939717] env[60722]: DEBUG oslo_service.periodic_task [None req-ca99632d-8f68-47b2-9b35-8edc5145c513 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60722) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}