[ 503.818698] env[60498]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 504.462951] env[60548]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 506.028408] env[60548]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=60548) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 506.028408] env[60548]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=60548) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 506.028804] env[60548]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=60548) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 506.028903] env[60548]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 506.030407] env[60548]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 506.151861] env[60548]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=60548) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 506.161547] env[60548]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=60548) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 506.268928] env[60548]: INFO nova.virt.driver [None req-0bfbdb3a-bfea-4faa-b40a-27e1f7c6f75c None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 506.348439] env[60548]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 506.348601] env[60548]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 506.348710] env[60548]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=60548) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 509.680883] env[60548]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-775a9e9e-6b68-4a97-80c5-27e6fee573d8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 509.696821] env[60548]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=60548) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 509.697051] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-4b1f014c-a7ef-4990-b9d9-62632c335e94 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 509.725438] env[60548]: INFO oslo_vmware.api [-] Successfully established new session; session ID is ddf03.
[ 509.725613] env[60548]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.377s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 509.726331] env[60548]: INFO nova.virt.vmwareapi.driver [None req-0bfbdb3a-bfea-4faa-b40a-27e1f7c6f75c None None] VMware vCenter version: 7.0.3
[ 509.729811] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-296bbfef-e0e4-4f2e-9376-63cc5fc0f220 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 509.748416] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67d85df6-5678-44bc-bae0-8c7e6e0157c6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 509.755889] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14a51d26-4b89-4563-8668-3c47482f619b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 509.763349] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb4b688b-d2ba-41cf-a341-7cbef69c9f5c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 509.779219] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce2717df-dd3b-4f3e-9a97-1a43b812af1b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 509.787614] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4853b2c4-2169-42a3-b426-f3e809fffd19 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 509.819115] env[60548]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-c5f225d5-95ef-404b-8969-0909206aa050 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 509.825083] env[60548]: DEBUG nova.virt.vmwareapi.driver [None req-0bfbdb3a-bfea-4faa-b40a-27e1f7c6f75c None None] Extension org.openstack.compute already exists. {{(pid=60548) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 509.827750] env[60548]: INFO nova.compute.provider_config [None req-0bfbdb3a-bfea-4faa-b40a-27e1f7c6f75c None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 509.845147] env[60548]: DEBUG nova.context [None req-0bfbdb3a-bfea-4faa-b40a-27e1f7c6f75c None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),09b9d109-a207-40de-96b0-5d124568ab99(cell1) {{(pid=60548) load_cells /opt/stack/nova/nova/context.py:464}}
[ 509.847160] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 509.847383] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 509.848131] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 509.848479] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Acquiring lock "09b9d109-a207-40de-96b0-5d124568ab99" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 509.848661] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Lock "09b9d109-a207-40de-96b0-5d124568ab99" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 509.849655] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Lock "09b9d109-a207-40de-96b0-5d124568ab99" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 509.862703] env[60548]: DEBUG oslo_db.sqlalchemy.engines [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60548) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 509.863145] env[60548]: DEBUG oslo_db.sqlalchemy.engines [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60548) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 509.869818] env[60548]: ERROR nova.db.main.api [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 509.869818] env[60548]: result = function(*args, **kwargs)
[ 509.869818] env[60548]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 509.869818] env[60548]: return func(*args, **kwargs)
[ 509.869818] env[60548]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 509.869818] env[60548]: result = fn(*args, **kwargs)
[ 509.869818] env[60548]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 509.869818] env[60548]: return f(*args, **kwargs)
[ 509.869818] env[60548]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 509.869818] env[60548]: return db.service_get_minimum_version(context, binaries)
[ 509.869818] env[60548]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 509.869818] env[60548]: _check_db_access()
[ 509.869818] env[60548]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 509.869818] env[60548]: stacktrace = ''.join(traceback.format_stack())
[ 509.869818] env[60548]:
[ 509.871332] env[60548]: ERROR nova.db.main.api [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 509.871332] env[60548]: result = function(*args, **kwargs)
[ 509.871332] env[60548]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 509.871332] env[60548]: return func(*args, **kwargs)
[ 509.871332] env[60548]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 509.871332] env[60548]: result = fn(*args, **kwargs)
[ 509.871332] env[60548]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 509.871332] env[60548]: return f(*args, **kwargs)
[ 509.871332] env[60548]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 509.871332] env[60548]: return db.service_get_minimum_version(context, binaries)
[ 509.871332] env[60548]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 509.871332] env[60548]: _check_db_access()
[ 509.871332] env[60548]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 509.871332] env[60548]: stacktrace = ''.join(traceback.format_stack())
[ 509.871332] env[60548]:
[ 509.871743] env[60548]: WARNING nova.objects.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Failed to get minimum service version for cell 09b9d109-a207-40de-96b0-5d124568ab99
[ 509.871862] env[60548]: WARNING nova.objects.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 509.872276] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Acquiring lock "singleton_lock" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 509.872438] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Acquired lock "singleton_lock" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 509.872677] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Releasing lock "singleton_lock" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 509.873021] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Full set of CONF: {{(pid=60548) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 509.873178] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ******************************************************************************** {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 509.873307] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] Configuration options gathered from: {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 509.873440] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 509.873634] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 509.873760] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ================================================================================ {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 509.873977] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] allow_resize_to_same_host = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.874182] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] arq_binding_timeout = 300 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.874313] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] backdoor_port = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.874439] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] backdoor_socket = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.874602] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] block_device_allocate_retries = 60 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.874761] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] block_device_allocate_retries_interval = 3 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.874928] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cert = self.pem {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.875111] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.875302] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute_monitors = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.875446] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] config_dir = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.875609] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] config_drive_format = iso9660 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.875740] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.875902] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] config_source = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.876075] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] console_host = devstack {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.876242] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] control_exchange = nova {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.876396] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cpu_allocation_ratio = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.876556] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] daemon = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.876717] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] debug = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.876870] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] default_access_ip_network_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.877041] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] default_availability_zone = nova {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.877200] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] default_ephemeral_format = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.877436] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.877596] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] default_schedule_zone = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.877749] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] disk_allocation_ratio = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.877936] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] enable_new_services = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.878106] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] enabled_apis = ['osapi_compute'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.878265] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] enabled_ssl_apis = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.878425] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] flat_injected = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.878580] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] force_config_drive = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.878734] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] force_raw_images = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.878898] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] graceful_shutdown_timeout = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.879065] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] heal_instance_info_cache_interval = 60 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.879279] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] host = cpu-1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.879447] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] initial_cpu_allocation_ratio = 4.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.879603] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] initial_disk_allocation_ratio = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.879768] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] initial_ram_allocation_ratio = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.879974] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.880149] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] instance_build_timeout = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.880307] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] instance_delete_interval = 300 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.880470] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] instance_format = [instance: %(uuid)s] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.880631] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] instance_name_template = instance-%08x {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.880793] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] instance_usage_audit = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.880983] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] instance_usage_audit_period = month {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.881174] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.881338] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] instances_path = /opt/stack/data/nova/instances {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.881501] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] internal_service_availability_zone = internal {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.881670] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] key = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.881853] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] live_migration_retry_count = 30 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.882029] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] log_config_append = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.882207] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.882369] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] log_dir = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.882526] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] log_file = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.882652] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] log_options = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.882811] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] log_rotate_interval = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.882997] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] log_rotate_interval_type = days {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.883215] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] log_rotation_type = none {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.883356] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.883485] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.883656] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.883820] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.883957] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.884192] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] long_rpc_timeout = 1800 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.884371] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] max_concurrent_builds = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.884532] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] max_concurrent_live_migrations = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.884689] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] max_concurrent_snapshots = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.884847] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] max_local_block_devices = 3 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.885011] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] max_logfile_count = 30 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.885178] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] max_logfile_size_mb = 200 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.885345] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] maximum_instance_delete_attempts = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.885509] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] metadata_listen = 0.0.0.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.885676] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] metadata_listen_port = 8775 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.885843] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] metadata_workers = 2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.886021] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] migrate_max_retries = -1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.886187] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] mkisofs_cmd = genisoimage {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.886404] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] my_block_storage_ip = 10.180.1.21 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.886538] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] my_ip = 10.180.1.21 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.886703] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] network_allocate_retries = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.886884] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.887064] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] osapi_compute_listen = 0.0.0.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.887288] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] osapi_compute_listen_port = 8774 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.887481] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] osapi_compute_unique_server_name_scope = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.887653] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] osapi_compute_workers = 2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.887818] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] password_length = 12 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.887979] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] periodic_enable = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.888156] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] periodic_fuzzy_delay = 60 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.888324] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] pointer_model = usbtablet {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.888491] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] preallocate_images = none {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.888651] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] publish_errors = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.888780] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] pybasedir = /opt/stack/nova {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.888937] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ram_allocation_ratio = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.889152] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] rate_limit_burst = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.889362] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] rate_limit_except_level = CRITICAL {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.889528] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] rate_limit_interval = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.889686] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] reboot_timeout = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.889842] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] reclaim_instance_interval = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.889996] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] record = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.890209] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] reimage_timeout_per_gb = 60 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.890381] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] report_interval = 120 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.890542] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] rescue_timeout = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.890702] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] reserved_host_cpus = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.890859] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] reserved_host_disk_mb = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.891050] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] reserved_host_memory_mb = 512 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.891232] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] reserved_huge_pages = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.891398] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] resize_confirm_window = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.891561] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] resize_fs_using_block_device = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.891721] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] resume_guests_state_on_host_boot = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.891891] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.892086] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] rpc_response_timeout = 60 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.892257] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] run_external_periodic_tasks = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.892426] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] running_deleted_instance_action = reap {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.892587] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] running_deleted_instance_poll_interval = 1800 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.892753] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] running_deleted_instance_timeout = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.893024] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] scheduler_instance_sync_interval = 120 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.893191] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] service_down_time = 300 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.893373] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] servicegroup_driver = db {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.893538] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] shelved_offload_time = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.893698] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] shelved_poll_interval = 3600 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.893865] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] shutdown_timeout = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.894037] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] source_is_ipv6 = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.894193] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ssl_only = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.894455] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.894623] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] sync_power_state_interval = 600 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.894782] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] sync_power_state_pool_size = 1000 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.894948] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] syslog_log_facility = LOG_USER {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.895140] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] tempdir = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.895399] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] timeout_nbd = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.895508] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] transport_url = **** {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.895668] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] update_resources_interval = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.895827] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] use_cow_images = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.895986] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] use_eventlog = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.896160] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] use_journal = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.896318] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] use_json = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.896471] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] use_rootwrap_daemon = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.896624] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] use_stderr = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.896781] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] use_syslog = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.896933] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vcpu_pin_set = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.897114] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plugging_is_fatal = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.897285] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plugging_timeout = 300 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.897446] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] virt_mkfs = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.897604] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] volume_usage_poll_interval = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.897763] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] watch_log_file = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.897928] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] web = /usr/share/spice-html5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 509.898132] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_concurrency.disable_process_locking = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.898452] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.898634] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.898796] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.898966] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_metrics.metrics_process_name = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.899153] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.899319] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.899497] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.auth_strategy = keystone {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.899663] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.compute_link_prefix = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.899836] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.900013] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.dhcp_domain = novalocal {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.900195] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.enable_instance_password = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.900357] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.glance_link_prefix = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.900520] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.900687] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.instance_list_cells_batch_strategy = distributed {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.900854] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.instance_list_per_project_cells = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.901067] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.list_records_by_skipping_down_cells = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.901248] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.local_metadata_per_cell = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.901415] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.max_limit = 1000 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.901580] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.metadata_cache_expiration = 15 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.901754] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.neutron_default_tenant_id = default {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.901919] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.use_forwarded_for = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.902094] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.use_neutron_default_nets = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.902266] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.902428] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.vendordata_dynamic_failure_fatal = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.902592] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.902762] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.vendordata_dynamic_ssl_certfile = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.902931] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.vendordata_dynamic_targets = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.903140] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.vendordata_jsonfile_path = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.903332] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api.vendordata_providers = ['StaticJSON'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.903525] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.backend = dogpile.cache.memcached {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.903693] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.backend_argument = **** {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.903862] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.config_prefix = cache.oslo {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.904042] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.dead_timeout = 60.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.904211] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.debug_cache_backend = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.904371] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.enable_retry_client = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.904529] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.enable_socket_keepalive = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.904698] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.enabled = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.904860] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.expiration_time = 600 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.905036] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.hashclient_retry_attempts = 2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.905209] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.hashclient_retry_delay = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.905371] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.memcache_dead_retry = 300 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.905538] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.memcache_password = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.905697] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.905856] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.906021] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.memcache_pool_maxsize = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.906182] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.memcache_pool_unused_timeout = 60 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.906344] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.memcache_sasl_enabled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.906520] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.memcache_servers = ['localhost:11211'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.906684] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.memcache_socket_timeout = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.906847] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.memcache_username = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.907025] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.proxies = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.907218] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.retry_attempts = 2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.907393] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.retry_delay = 0.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.907558] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.socket_keepalive_count = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.907720] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.socket_keepalive_idle = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.907882] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.socket_keepalive_interval = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.908051] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.tls_allowed_ciphers = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.908215] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.tls_cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.908371] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.tls_certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.908530] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.tls_enabled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.908686] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cache.tls_keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.908857] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.auth_section = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.909040] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.auth_type = password {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.909212] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.909389] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.catalog_info = volumev3::publicURL {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.909549] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.909710] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.909870] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.cross_az_attach = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.910040] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.debug = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.910205] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.endpoint_template = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 509.910366] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f
None None] cinder.http_retries = 3 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.910527] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.insecure = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.910682] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.910853] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.os_region_name = RegionOne {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.911073] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.911253] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cinder.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.911426] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.911587] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute.cpu_dedicated_set = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.911750] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute.cpu_shared_set = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.911916] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute.image_type_exclude_list = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.912090] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute.live_migration_wait_for_vif_plug = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.912259] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute.max_concurrent_disk_ops = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.912421] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute.max_disk_devices_to_attach = -1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.912582] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.912750] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
509.912911] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute.resource_provider_association_refresh = 300 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.913116] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute.shutdown_retry_interval = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.913311] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.913495] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] conductor.workers = 2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.913671] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] console.allowed_origins = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.913830] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] console.ssl_ciphers = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.913999] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] console.ssl_minimum_version = default {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.914188] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] consoleauth.token_ttl = 600 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.914355] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.914509] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.914671] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.914826] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.connect_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.914983] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.connect_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.915222] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.endpoint_override = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.915413] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.insecure = False {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.915574] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.915737] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.max_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.915896] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.min_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.916065] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.region_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.916228] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.service_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.916397] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.service_type = accelerator {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.916558] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.916716] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.status_code_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.916869] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.status_code_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.917034] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.917221] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.917393] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] cyborg.version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.917578] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.backend = sqlalchemy {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.917757] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.connection = **** {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.917927] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.connection_debug = 0 {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.918109] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.connection_parameters = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.918285] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.connection_recycle_time = 3600 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.918451] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.connection_trace = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.918612] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.db_inc_retry_interval = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.918775] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.db_max_retries = 20 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.918937] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.db_max_retry_interval = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.919110] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.db_retry_interval = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.919296] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.max_overflow = 50 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.919462] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.max_pool_size = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.919632] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.max_retries = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.919797] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.mysql_enable_ndb = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.919967] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.mysql_sql_mode = TRADITIONAL {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.920143] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.mysql_wsrep_sync_wait = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.920306] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.pool_timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.920471] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.retry_interval = 10 
{{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.920627] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.slave_connection = **** {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.920789] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.sqlite_synchronous = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.920976] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] database.use_db_reconnect = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.921276] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.backend = sqlalchemy {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.921483] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.connection = **** {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.921661] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.connection_debug = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.921888] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.connection_parameters = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.922102] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.connection_recycle_time = 3600 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.922284] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.connection_trace = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.922449] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.db_inc_retry_interval = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.922612] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.db_max_retries = 20 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.922772] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.db_max_retry_interval = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.922932] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.db_retry_interval = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.923146] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.max_overflow = 50 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.923318] env[60548]: DEBUG oslo_service.service [None 
req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.max_pool_size = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.923487] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.max_retries = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.923652] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.mysql_enable_ndb = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.923822] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.923999] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.mysql_wsrep_sync_wait = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.924190] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.pool_timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.924360] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.retry_interval = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.924518] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.slave_connection = **** {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.924682] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] api_database.sqlite_synchronous = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.924855] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] devices.enabled_mdev_types = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.925039] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.925212] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ephemeral_storage_encryption.enabled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.925374] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ephemeral_storage_encryption.key_size = 512 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.925542] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.api_servers = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.925705] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.cafile = None {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.925864] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.926034] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.926198] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.connect_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.926356] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.connect_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.928207] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.debug = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.928403] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.default_trusted_certificate_ids = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.928579] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.enable_certificate_validation = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.928780] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.enable_rbd_download = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.928949] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.endpoint_override = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.929132] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.insecure = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.929298] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.929455] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.max_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.929609] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.min_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.929768] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.num_retries = 3 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.929934] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.rbd_ceph_conf = {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.930114] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.rbd_connect_timeout = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.930287] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.rbd_pool = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.930456] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.rbd_user = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.930613] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.region_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.930768] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.service_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.931036] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.service_type = image {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.931206] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.931372] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.status_code_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.931533] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.status_code_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.931694] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.931878] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.932055] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.verify_glance_signatures = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.932225] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] glance.version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.932396] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] guestfs.debug = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.932568] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.config_drive_cdrom = False {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.932734] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.config_drive_inject_password = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.932904] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.933110] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.enable_instance_metrics_collection = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.933286] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.enable_remotefx = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.933456] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.instances_path_share = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.933619] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.iscsi_initiator_list = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.933781] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.limit_cpu_features = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.933946] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.934146] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.934326] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.power_state_check_timeframe = 60 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.934489] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.power_state_event_polling_interval = 2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.934660] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.934825] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.use_multipath_io = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.934988] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.volume_attach_retry_count = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.935188] env[60548]: DEBUG oslo_service.service [None 
req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.volume_attach_retry_interval = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.935339] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.vswitch_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.935499] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.935669] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] mks.enabled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.936041] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.936240] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] image_cache.manager_interval = 2400 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.936411] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] image_cache.precache_concurrency = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.936581] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] image_cache.remove_unused_base_images = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.936751] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.936917] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.937106] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] image_cache.subdirectory_name = _base {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.937288] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.api_max_retries = 60 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.937451] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.api_retry_interval = 2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.937606] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.auth_section = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.937764] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.auth_type = None {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.937920] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.938086] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.938252] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.938407] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.connect_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.938560] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.connect_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.938711] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.endpoint_override = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.938869] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.insecure = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.939033] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.939191] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.max_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.939342] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.min_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.939495] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.partition_key = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.939657] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.peer_list = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.939806] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.region_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.939965] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.serial_console_state_timeout = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.940133] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.service_name = None {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.940315] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.service_type = baremetal {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.941011] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.941011] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.status_code_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.941011] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.status_code_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.941011] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.941197] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.941329] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ironic.version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.941508] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.941682] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] key_manager.fixed_key = **** {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.941863] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.942052] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.barbican_api_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948488] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.barbican_endpoint = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948488] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.barbican_endpoint_type = public {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948488] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.barbican_region_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948488] env[60548]: DEBUG oslo_service.service [None 
req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948488] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948488] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948488] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.insecure = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948896] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948896] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.number_of_retries = 60 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948896] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.retry_delay = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948896] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.send_service_user_token = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948896] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948896] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.948896] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.verify_ssl = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949200] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican.verify_ssl_path = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949200] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican_service_user.auth_section = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949200] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican_service_user.auth_type = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949200] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican_service_user.cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949200] env[60548]: DEBUG oslo_service.service [None 
req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican_service_user.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949200] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican_service_user.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949200] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican_service_user.insecure = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949386] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican_service_user.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949386] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican_service_user.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949386] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] barbican_service_user.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949386] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.approle_role_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949386] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.approle_secret_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949386] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949386] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949567] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949567] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.insecure = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949567] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949567] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.kv_mountpoint = secret {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949567] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.kv_version = 2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949567] env[60548]: DEBUG oslo_service.service [None 
req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.namespace = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949567] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.root_token_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949761] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949761] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.ssl_ca_crt_file = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949761] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949761] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.use_ssl = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949761] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949761] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.949761] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950012] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950012] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.connect_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950012] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.connect_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950012] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.endpoint_override = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950012] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.insecure = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950012] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950012] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f 
None None] keystone.max_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950198] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.min_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950198] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.region_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950508] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.service_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950508] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.service_type = identity {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950624] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950769] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.status_code_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.950926] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.status_code_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.951091] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.951272] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.951428] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] keystone.version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.951625] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.connection_uri = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.951783] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.cpu_mode = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.951948] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.cpu_model_extra_flags = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.952127] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.cpu_models = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.952301] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None 
None] libvirt.cpu_power_governor_high = performance {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.952567] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.cpu_power_governor_low = powersave {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.952644] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.cpu_power_management = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.952797] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.952982] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.device_detach_attempts = 8 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.953164] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.device_detach_timeout = 20 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.953336] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.disk_cachemodes = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.953494] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.disk_prefix = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.953656] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.enabled_perf_events = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.953818] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.file_backed_memory = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.954012] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.gid_maps = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.954189] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.hw_disk_discard = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.954348] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.hw_machine_type = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.954522] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.images_rbd_ceph_conf = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.954689] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.954855] env[60548]: DEBUG 
oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.955032] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.images_rbd_glance_store_name = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.955209] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.images_rbd_pool = rbd {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.955381] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.images_type = default {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.955537] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.images_volume_group = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.955699] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.inject_key = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.955856] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.inject_partition = -2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.956028] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.inject_password = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.956185] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.iscsi_iface = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.956347] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.iser_use_multipath = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.956509] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_bandwidth = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.956669] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_completion_timeout = 800 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.956828] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_downtime = 500 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.956990] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_downtime_delay = 75 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.957168] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_downtime_steps = 10 {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.957320] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_inbound_addr = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.957479] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_permit_auto_converge = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.957634] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_permit_post_copy = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.957801] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_scheme = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.957975] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_timeout_action = abort {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.958152] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_tunnelled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.958309] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_uri = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.958471] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.live_migration_with_native_tls = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.958630] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.max_queues = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.958790] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.mem_stats_period_seconds = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.958945] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.nfs_mount_options = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.959281] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.959458] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.num_aoe_discover_tries = 3 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.959620] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.num_iser_scan_tries = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.959780] env[60548]: DEBUG 
oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.num_memory_encrypted_guests = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.959939] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.num_nvme_discover_tries = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.960113] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.num_pcie_ports = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.960279] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.num_volume_scan_tries = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.960441] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.pmem_namespaces = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.960597] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.quobyte_client_cfg = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.960894] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.961078] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.rbd_connect_timeout = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.961249] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.961410] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.961567] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.rbd_secret_uuid = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.961720] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.rbd_user = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.961877] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.realtime_scheduler_priority = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.962056] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.remote_filesystem_transport = ssh {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.962222] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.rescue_image_id = None {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.962377] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.rescue_kernel_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.962530] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.rescue_ramdisk_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.962692] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.rng_dev_path = /dev/urandom {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.962847] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.rx_queue_size = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.963044] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.smbfs_mount_options = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.963350] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.963525] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.snapshot_compression = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.963686] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.snapshot_image_format = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.963906] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.964129] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.sparse_logical_volumes = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.964317] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.swtpm_enabled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.964490] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.swtpm_group = tss {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.964658] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.swtpm_user = tss {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.964826] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.sysinfo_serial = unique {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.964984] env[60548]: DEBUG oslo_service.service [None 
req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.tx_queue_size = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.965163] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.uid_maps = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.965330] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.use_virtio_for_bridges = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.965496] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.virt_type = kvm {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.965663] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.volume_clear = zero {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.965828] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.volume_clear_size = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.965994] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.volume_use_multipath = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.966194] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.vzstorage_cache_path = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.966388] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.966590] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.vzstorage_mount_group = qemu {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.966755] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.vzstorage_mount_opts = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.966928] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.967250] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.967434] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.vzstorage_mount_user = stack {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.967603] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.967776] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.auth_section = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.967948] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.auth_type = password {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.968127] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.968289] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.968451] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.968608] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.connect_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.968767] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.connect_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.968935] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.default_floating_pool = public {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.969104] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.endpoint_override = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.969271] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.extension_sync_interval = 600 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.969428] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.http_retries = 3 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.969590] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.insecure = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.969742] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.969894] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.max_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.970068] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.metadata_proxy_shared_secret = **** {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.970232] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.min_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.970397] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.ovs_bridge = br-int {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.970558] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.physnets = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.970724] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.region_name = RegionOne {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.970893] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.service_metadata_proxy = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.971067] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.service_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.971246] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.service_type = network {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.971409] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.971566] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.status_code_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.971724] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.status_code_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.971880] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.972076] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.972338] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] neutron.version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.972549] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] notifications.bdms_in_notifications = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.972735] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] notifications.default_level = INFO 
{{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.972915] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] notifications.notification_format = unversioned {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.973119] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] notifications.notify_on_state_change = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.973311] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.973489] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] pci.alias = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.973660] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] pci.device_spec = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.973822] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] pci.report_in_placement = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.973994] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.auth_section = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.974213] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.auth_type = password {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.974390] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.auth_url = http://10.180.1.21/identity {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.974550] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.974706] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.974871] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.975039] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.connect_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.975206] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.connect_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.975365] env[60548]: DEBUG oslo_service.service [None 
req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.default_domain_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.975522] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.default_domain_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.975677] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.domain_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.975832] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.domain_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.975990] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.endpoint_override = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.976165] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.insecure = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.976323] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.976477] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.max_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.976631] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.min_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.976797] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.password = **** {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.976956] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.project_domain_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.977134] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.project_domain_name = Default {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.977303] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.project_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.977476] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.project_name = service {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.977639] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.region_name = RegionOne {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.977794] env[60548]: DEBUG oslo_service.service 
[None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.service_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.977960] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.service_type = placement {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.978137] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.978329] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.status_code_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.978501] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.status_code_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.978659] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.system_scope = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.978815] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.978972] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.trust_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.979144] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.user_domain_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.979312] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.user_domain_name = Default {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.979473] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.user_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.979644] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.username = placement {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.979826] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.979986] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] placement.version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.980180] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.cores = 20 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.980348] env[60548]: DEBUG 
oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.count_usage_from_placement = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.980522] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.980696] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.injected_file_content_bytes = 10240 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.980863] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.injected_file_path_length = 255 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.981040] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.injected_files = 5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.981215] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.instances = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.981381] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.key_pairs = 100 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.981547] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.metadata_items = 128 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.981713] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.ram = 51200 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.981879] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.recheck_quota = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.982056] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.server_group_members = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.982230] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] quota.server_groups = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.982400] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] rdp.enabled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.982719] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.982913] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.983124] 
env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.983302] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] scheduler.image_metadata_prefilter = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.983468] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.983636] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] scheduler.max_attempts = 3 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.983804] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] scheduler.max_placement_results = 1000 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.983977] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.984171] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] scheduler.query_placement_for_availability_zone = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.984339] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] scheduler.query_placement_for_image_type_support = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.984501] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.984676] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] scheduler.workers = 2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.984848] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.985028] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.985220] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.985387] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.985555] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.985719] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.985876] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.986074] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.986245] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.host_subset_size = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.986405] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.image_properties_default_architecture = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.986566] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.986730] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.isolated_hosts = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.986893] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.isolated_images = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.987064] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.max_instances_per_host = 50 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.987233] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.987398] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.pci_in_placement = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.987560] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.987718] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.987880] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.988050] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.988220] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.988384] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.988543] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.track_instance_changes = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.988716] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.988883] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] metrics.required = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.989056] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] metrics.weight_multiplier = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.989223] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] metrics.weight_of_unavailable = -10000.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.989386] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] metrics.weight_setting = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.989695] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.989869] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] serial_console.enabled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.990048] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] serial_console.port_range = 10000:20000 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.990223] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.990390] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.990560] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] serial_console.serialproxy_port = 6083 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.990724] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] service_user.auth_section = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.990893] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] service_user.auth_type = password {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.991061] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] service_user.cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.991221] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] service_user.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.991384] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] service_user.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.991538] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] service_user.insecure = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.991692] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] service_user.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.991858] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] service_user.send_service_user_token = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.992029] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] service_user.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.992190] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None 
None] service_user.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.992360] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] spice.agent_enabled = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.992536] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] spice.enabled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.992844] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.993053] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] spice.html5proxy_host = 0.0.0.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.993226] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] spice.html5proxy_port = 6082 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.993387] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] spice.image_compression = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.993547] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] spice.jpeg_compression = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.993706] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] spice.playback_compression = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.993873] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] spice.server_listen = 127.0.0.1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.994056] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.994219] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] spice.streaming_mode = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.994375] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] spice.zlib_compression = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.994540] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] upgrade_levels.baseapi = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.994696] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] upgrade_levels.cert = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.994868] env[60548]: DEBUG oslo_service.service [None 
req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] upgrade_levels.compute = auto {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.995038] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] upgrade_levels.conductor = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.995211] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] upgrade_levels.scheduler = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.995380] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vendordata_dynamic_auth.auth_section = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.995542] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vendordata_dynamic_auth.auth_type = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.995701] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vendordata_dynamic_auth.cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.995860] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vendordata_dynamic_auth.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.996039] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vendordata_dynamic_auth.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.996207] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vendordata_dynamic_auth.insecure = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.996364] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vendordata_dynamic_auth.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.996525] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vendordata_dynamic_auth.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.996681] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vendordata_dynamic_auth.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.996853] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.api_retry_count = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.997021] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.ca_file = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.997201] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.cache_prefix = devstack-image-cache {{(pid=60548) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.997368] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.cluster_name = testcl1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.997535] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.connection_pool_size = 10 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.997692] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.console_delay_seconds = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.997861] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.datastore_regex = ^datastore.* {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.998090] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.998268] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.host_password = **** {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.998434] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.host_port = 443 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.998603] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.host_username = administrator@vsphere.local {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.998774] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.insecure = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.998938] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.integration_bridge = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.999112] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.maximum_objects = 100 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.999274] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.pbm_default_policy = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.999437] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.pbm_enabled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.999592] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.pbm_wsdl_location = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.999760] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 509.999917] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.serial_port_proxy_uri = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.000082] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.serial_port_service_uri = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.000256] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.task_poll_interval = 0.5 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.000431] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.use_linked_clone = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.000598] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.vnc_keymap = en-us {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.000763] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.vnc_port = 5900 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.000934] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vmware.vnc_port_total = 10000 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.001130] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vnc.auth_schemes = ['none'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.001313] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vnc.enabled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.001618] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.001812] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.001982] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vnc.novncproxy_port = 6080 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.002176] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vnc.server_listen = 127.0.0.1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.002353] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.002517] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f 
None None] vnc.vencrypt_ca_certs = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.002677] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vnc.vencrypt_client_cert = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.002834] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vnc.vencrypt_client_key = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.003028] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.003194] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.disable_deep_image_inspection = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.003362] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.disable_fallback_pcpu_query = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.003525] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.disable_group_policy_check_upcall = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.003689] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.003853] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.disable_rootwrap = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.004031] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.enable_numa_live_migration = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.004191] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.004395] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.004570] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.handle_virt_lifecycle_events = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.004733] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.libvirt_disable_apic = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.004894] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] 
workarounds.never_download_image_if_on_rbd = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.005084] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.005303] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.005438] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.005600] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.005762] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.005921] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.006108] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.006290] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.006455] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.006640] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.006810] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] wsgi.client_socket_timeout = 900 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.006977] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] wsgi.default_pool_size = 1000 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.007181] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] wsgi.keep_alive = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.007357] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] 
wsgi.max_header_line = 16384 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.007544] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] wsgi.secure_proxy_ssl_header = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.007801] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] wsgi.ssl_ca_file = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.008077] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] wsgi.ssl_cert_file = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.008283] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] wsgi.ssl_key_file = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.008459] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] wsgi.tcp_keepidle = 600 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.008639] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.008807] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] zvm.ca_file = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.008971] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] zvm.cloud_connector_url = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.009311] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.009498] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] zvm.reachable_timeout = 300 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.009679] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_policy.enforce_new_defaults = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.009850] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_policy.enforce_scope = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.010044] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_policy.policy_default_rule = default {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.010255] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
[ 510.010436] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_policy.policy_file = policy.yaml {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.010611] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.010772] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.010929] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.011119] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.011295] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.011468] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.011645] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.011823] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] profiler.connection_string = messaging:// {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.011992] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] profiler.enabled = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.012199] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] profiler.es_doc_type = notification {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.012368] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] profiler.es_scroll_size = 10000 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.012540] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] profiler.es_scroll_time = 2m {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.012702] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] profiler.filter_error_trace = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.012869] env[60548]: DEBUG oslo_service.service [None 
req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] profiler.hmac_keys = SECRET_KEY {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.013085] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] profiler.sentinel_service_name = mymaster {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.013283] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] profiler.socket_timeout = 0.1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.013452] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] profiler.trace_sqlalchemy = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.013621] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] remote_debug.host = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.013781] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] remote_debug.port = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.013964] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.014179] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.014356] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.014520] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.014682] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.014863] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.015008] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.015200] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.015367] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.015525] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.015696] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.015863] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.016048] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.016236] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.016402] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.016577] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.016741] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.016954] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.017167] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.017346] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.017508] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.017675] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.017836] env[60548]: DEBUG 
oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.017997] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.018208] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.018382] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.ssl = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.018557] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.018729] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.018895] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.019085] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.019265] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_rabbit.ssl_version = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.019457] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.019624] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_notifications.retry = -1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.019807] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.019981] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_messaging_notifications.transport_url = **** {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.020169] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.auth_section = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.020333] env[60548]: DEBUG oslo_service.service [None 
req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.auth_type = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.020492] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.cafile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.020647] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.certfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.020805] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.collect_timing = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.020989] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.connect_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.021168] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.connect_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.021338] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.endpoint_id = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.021497] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.endpoint_override = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.021655] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.insecure = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.021809] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.keyfile = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.021962] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.max_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.022130] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.min_version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.022284] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.region_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.022436] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.service_name = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.022590] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.service_type = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.022747] env[60548]: DEBUG oslo_service.service [None 
req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.split_loggers = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.022903] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.status_code_retries = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.023095] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.status_code_retry_delay = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.023263] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.timeout = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.023424] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.valid_interfaces = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.023583] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_limit.version = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.023744] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_reports.file_event_handler = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.023907] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_reports.file_event_handler_interval = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.024091] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] oslo_reports.log_dir = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.024281] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.024444] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plug_linux_bridge_privileged.group = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.024603] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.024769] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.024933] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.025102] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] 
vif_plug_linux_bridge_privileged.user = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.025276] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.025432] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plug_ovs_privileged.group = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.025587] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plug_ovs_privileged.helper_command = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.025749] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.025908] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.026074] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] vif_plug_ovs_privileged.user = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.026248] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_linux_bridge.flat_interface = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.026429] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.026600] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.026769] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.026936] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.027115] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.027287] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.027445] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_linux_bridge.vlan_interface = None 
{{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.027624] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_ovs.isolate_vif = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.027789] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.027956] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.028141] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.028315] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_ovs.ovsdb_interface = native {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.028476] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_vif_ovs.per_port_bridge = False {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.028638] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_brick.lock_path = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.028801] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.029242] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] os_brick.wait_mpath_device_interval = 1 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.029242] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] privsep_osbrick.capabilities = [21] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.029330] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] privsep_osbrick.group = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.029439] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] privsep_osbrick.helper_command = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.029605] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.029771] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] privsep_osbrick.thread_pool_size = 8 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.029925] env[60548]: DEBUG oslo_service.service 
[None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] privsep_osbrick.user = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.030106] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.030267] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] nova_sys_admin.group = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.030423] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] nova_sys_admin.helper_command = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.030584] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.030746] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] nova_sys_admin.thread_pool_size = 8 {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.030921] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] nova_sys_admin.user = None {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 510.031104] env[60548]: DEBUG oslo_service.service [None req-dac5d965-1145-45e0-9abe-b2363f81ad5f None None] ******************************************************************************** {{(pid=60548) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 510.031553] env[60548]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 510.040930] env[60548]: INFO nova.virt.node [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Generated node identity 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 [ 510.041192] env[60548]: INFO nova.virt.node [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Wrote node identity 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 to /opt/stack/data/n-cpu-1/compute_id [ 510.052592] env[60548]: WARNING nova.compute.manager [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Compute nodes ['3c0a58fa-f44f-43ae-bee7-c3032edaaa64'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 510.086569] env[60548]: INFO nova.compute.manager [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 510.110808] env[60548]: WARNING nova.compute.manager [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
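The run of group.option = value lines above, closed by the row of asterisks, is oslo.config's own startup dump: nova-compute calls ConfigOpts.log_opt_values(), which emits one DEBUG line per registered option and masks secret options such as vmware.host_password as '****', then logs a terminating line of asterisks. A minimal, self-contained sketch of that mechanism, assuming only oslo.config and the standard library (the two sample options are illustrative, not nova's full set):

    import logging

    from oslo_config import cfg

    CONF = cfg.ConfigOpts()
    # Two illustrative options; nova registers hundreds of these across many groups.
    CONF.register_opts(
        [
            cfg.StrOpt('host_ip', default='127.0.0.1'),
            cfg.StrOpt('host_password', secret=True),  # dumped as '****'
        ],
        group='vmware',
    )

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    CONF([])  # parse an (empty) command line so the config object is usable
    CONF.log_opt_values(LOG, logging.DEBUG)  # one line per option, then a row of asterisks

Each dumped group.option pair maps one-to-one onto a section of nova.conf, so the vmware.* block above reflects a [vmware] section carrying host_ip, cluster_name, datastore_regex, and so on.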
[ 510.111149] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 510.111287] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 510.111433] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 510.111589] env[60548]: DEBUG nova.compute.resource_tracker [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60548) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 510.112742] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff0ceb04-1b23-4a0b-83bc-705b1705031f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 510.122443] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44a227dc-16b5-4bcb-a0ad-52acb8926475 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 510.137475] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38b5431e-e64e-4a0b-815c-02b01b13ce91 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 510.144552] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2e33187-ecab-4448-9e8e-dae03554c567 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 510.174660] env[60548]: DEBUG nova.compute.resource_tracker [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180705MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60548) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 510.174838] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 510.175013] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 510.188829] env[60548]: WARNING nova.compute.resource_tracker [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] No compute node 
record for cpu-1:3c0a58fa-f44f-43ae-bee7-c3032edaaa64: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 could not be found. [ 510.201959] env[60548]: INFO nova.compute.resource_tracker [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 [ 510.256850] env[60548]: DEBUG nova.compute.resource_tracker [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 510.257026] env[60548]: DEBUG nova.compute.resource_tracker [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=100GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 510.367068] env[60548]: INFO nova.scheduler.client.report [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] [req-752b1f84-1e0b-4133-ad3e-beef34904149] Created resource provider record via placement API for resource provider with UUID 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 510.384916] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aa8969a-0c95-4b13-a9cf-1bdd3db5bb23 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 510.393318] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f80585f2-d5f0-4d8b-bffa-5c6abfa7b2f3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 510.423654] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee69352f-94f1-4ad2-bedc-50a375a9bf03 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 510.431729] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e79865a7-0277-4458-a8fb-202097d4f48e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 510.445654] env[60548]: DEBUG nova.compute.provider_tree [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Updating inventory in ProviderTree for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 510.481264] env[60548]: DEBUG nova.scheduler.client.report [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Updated inventory for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 510.481493] env[60548]: DEBUG nova.compute.provider_tree [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Updating resource provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 generation from 0 to 1 during operation: update_inventory {{(pid=60548) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 510.481635] env[60548]: DEBUG nova.compute.provider_tree [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Updating inventory in ProviderTree for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 510.524761] env[60548]: DEBUG nova.compute.provider_tree [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Updating resource provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 generation from 1 to 2 during operation: update_traits {{(pid=60548) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 510.543261] env[60548]: DEBUG nova.compute.resource_tracker [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60548) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 510.543473] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.368s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 510.543615] env[60548]: DEBUG nova.service [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Creating RPC server for service compute {{(pid=60548) start /opt/stack/nova/nova/service.py:182}} [ 510.556750] env[60548]: DEBUG nova.service [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] Join ServiceGroup membership for this service compute {{(pid=60548) start /opt/stack/nova/nova/service.py:199}} [ 510.556944] env[60548]: DEBUG nova.servicegroup.drivers.db [None req-ccd3c1dc-5be2-480f-9cf7-c6f7841f3db5 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=60548) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 532.562214] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 532.572984] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Getting list of instances from cluster (obj){ [ 532.572984] env[60548]: value = "domain-c8" [ 532.572984] env[60548]: _type = "ClusterComputeResource" [ 532.572984] env[60548]: } {{(pid=60548) list_instances 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 532.574154] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdbae61a-be15-4ee7-b430-7392a8985c2b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.583675] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Got total of 0 instances {{(pid=60548) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 532.583912] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 532.584250] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Getting list of instances from cluster (obj){ [ 532.584250] env[60548]: value = "domain-c8" [ 532.584250] env[60548]: _type = "ClusterComputeResource" [ 532.584250] env[60548]: } {{(pid=60548) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 532.585110] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5aae637-ca99-4b45-89b7-756d8e492a68 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 532.592957] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Got total of 0 instances {{(pid=60548) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 555.140588] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Acquiring lock "b974272a-5c32-4ed2-99db-1b1ac744d08c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 555.140966] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Lock "b974272a-5c32-4ed2-99db-1b1ac744d08c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 555.159363] env[60548]: DEBUG nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Starting instance...
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 555.270014] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 555.270370] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 555.271927] env[60548]: INFO nova.compute.claims [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 555.411630] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76be4dcd-c5a4-4b44-bb85-cb61919a4364 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.432189] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49e4edde-530c-41ac-b998-0b9d5786bbab {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.468115] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc90faaa-9e95-4139-bd83-3a5f7d1ab439 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.476939] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27126860-6055-44df-aea7-55b7d42a1fd3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.496238] env[60548]: DEBUG nova.compute.provider_tree [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 555.508254] env[60548]: DEBUG nova.scheduler.client.report [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 555.533208] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 
tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 555.534316] env[60548]: DEBUG nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 555.583050] env[60548]: DEBUG nova.compute.utils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 555.586902] env[60548]: DEBUG nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 555.586902] env[60548]: DEBUG nova.network.neutron [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 555.604374] env[60548]: DEBUG nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 555.711163] env[60548]: DEBUG nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Start spawning the instance on the hypervisor. 
{{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 555.891021] env[60548]: DEBUG nova.virt.hardware [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 555.891021] env[60548]: DEBUG nova.virt.hardware [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 555.891021] env[60548]: DEBUG nova.virt.hardware [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 555.891427] env[60548]: DEBUG nova.virt.hardware [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 555.891427] env[60548]: DEBUG nova.virt.hardware [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 555.891427] env[60548]: DEBUG nova.virt.hardware [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 555.891427] env[60548]: DEBUG nova.virt.hardware [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 555.891427] env[60548]: DEBUG nova.virt.hardware [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 555.891646] 
env[60548]: DEBUG nova.virt.hardware [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 555.891646] env[60548]: DEBUG nova.virt.hardware [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 555.892222] env[60548]: DEBUG nova.virt.hardware [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 555.893663] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ebd198d-69b4-42a6-a316-f6162ad00705 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.909585] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbcc2bd3-8836-4091-9161-0f64035e4268 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 555.942908] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40617f03-77bb-4941-b668-f1f25cacaae6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 556.019366] env[60548]: DEBUG nova.policy [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '04842364b6554b0098443e52c3519d4b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c1b2e2fde40e4ec799922e3ac4780416', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 557.235851] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquiring lock "3a7076f4-fc00-4f82-804b-4dac0de9ab3d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 557.236124] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Lock "3a7076f4-fc00-4f82-804b-4dac0de9ab3d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 557.265194] env[60548]: DEBUG nova.compute.manager [None
req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 557.377538] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 557.377803] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 557.382059] env[60548]: INFO nova.compute.claims [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 557.401649] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Acquiring lock "31e48a76-ffbc-4bd2-a01f-2a69df2de5f5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 557.403853] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Lock "31e48a76-ffbc-4bd2-a01f-2a69df2de5f5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 557.415032] env[60548]: DEBUG nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Starting instance...
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 557.481922] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 557.520385] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Acquiring lock "d76e8d11-53d3-417d-b6a6-08bdff8165d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 557.521657] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Lock "d76e8d11-53d3-417d-b6a6-08bdff8165d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 557.537112] env[60548]: DEBUG nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 557.565294] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82935f2d-7c27-4d08-bc5d-9cfcd82e680f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.575621] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02f395c9-4559-4a94-a032-6d2e434196ea {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.614305] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1da76ae0-7a03-405f-8430-2699c9baa6b6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.624917] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-389f6df7-3afa-4c90-9470-5fe45aaacc6c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.629624] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 557.643222] env[60548]: DEBUG nova.compute.provider_tree [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64
{{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 557.653790] env[60548]: DEBUG nova.scheduler.client.report [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 557.669989] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 557.670562] env[60548]: DEBUG nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 557.673896] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.192s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 557.677524] env[60548]: INFO nova.compute.claims [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 557.722899] env[60548]: DEBUG nova.compute.utils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 557.723452] env[60548]: DEBUG nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Allocating IP information in the background. 
{{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 557.723808] env[60548]: DEBUG nova.network.neutron [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 557.741135] env[60548]: DEBUG nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 557.842932] env[60548]: DEBUG nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 557.849843] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-517ef178-eaaa-42e8-b50d-e7a49c96e5fd {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.863277] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3113c8c8-9365-44e8-9280-10fc84ede34b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.899524] env[60548]: DEBUG nova.virt.hardware [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 557.899867] env[60548]: DEBUG nova.virt.hardware [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 557.900342] env[60548]: DEBUG nova.virt.hardware [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 557.900342] env[60548]: DEBUG nova.virt.hardware [None req-943340a9-8bfb-497d-894d-b41b7d054998 
tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 557.900522] env[60548]: DEBUG nova.virt.hardware [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 557.900711] env[60548]: DEBUG nova.virt.hardware [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 557.900962] env[60548]: DEBUG nova.virt.hardware [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 557.901235] env[60548]: DEBUG nova.virt.hardware [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 557.901457] env[60548]: DEBUG nova.virt.hardware [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 557.901685] env[60548]: DEBUG nova.virt.hardware [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 557.901902] env[60548]: DEBUG nova.virt.hardware [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 557.902760] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bcbd276-0c0b-4d11-ab52-102133c0098a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.906278] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da522e57-593b-484a-bed4-17a44a7ae9f2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.915645] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e637cee9-b3d3-46a5-9e25-0c404603cb42 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.933963] env[60548]: DEBUG nova.compute.provider_tree [None 
req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 557.938872] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e348bc1-3ccd-43a5-9b36-6730941115e1 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.955147] env[60548]: DEBUG nova.scheduler.client.report [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 557.974121] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 557.974620] env[60548]: DEBUG nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 557.977804] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.348s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 557.981113] env[60548]: INFO nova.compute.claims [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 558.031363] env[60548]: DEBUG nova.compute.utils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 558.035865] env[60548]: DEBUG nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Allocating IP information in the background. 
{{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 558.036127] env[60548]: DEBUG nova.network.neutron [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 558.054242] env[60548]: DEBUG nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 558.077290] env[60548]: DEBUG nova.policy [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'de88047992ae4098acd029bd2bd55b1f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a91009401dd409ca662573757dfaf88', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 558.139125] env[60548]: DEBUG nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Start spawning the instance on the hypervisor. 
{{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 558.164032] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3124417c-8650-48a3-97fb-e36c9a4c43d8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.168977] env[60548]: DEBUG nova.virt.hardware [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 558.169175] env[60548]: DEBUG nova.virt.hardware [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 558.169362] env[60548]: DEBUG nova.virt.hardware [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 558.169572] env[60548]: DEBUG nova.virt.hardware [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 558.169768] env[60548]: DEBUG nova.virt.hardware [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 558.170091] env[60548]: DEBUG nova.virt.hardware [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 558.170243] env[60548]: DEBUG nova.virt.hardware [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 558.171352] env[60548]: DEBUG nova.virt.hardware [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 
tempest-ServerDiagnosticsTest-533831989-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 558.171352] env[60548]: DEBUG nova.virt.hardware [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 558.171352] env[60548]: DEBUG nova.virt.hardware [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 558.171352] env[60548]: DEBUG nova.virt.hardware [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 558.171809] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-734c5f02-9c50-48a7-a67e-724f69c43c8f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.187648] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55a6c368-febe-45cb-8826-bcba28723abf {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.192712] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4592e2ad-1431-478f-90c0-2f06cd24138a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.197839] env[60548]: DEBUG nova.network.neutron [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Successfully created port: 8bfce6a0-ed43-42e6-8285-8985812271fa {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 558.238846] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9be4f63-d1e1-47bb-b83a-5d66fb0dc122 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.247948] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01e288d1-26b7-429c-b91e-faf54aeacc00 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.262891] env[60548]: DEBUG nova.compute.provider_tree [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 558.274717] env[60548]: DEBUG nova.scheduler.client.report [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Inventory has not changed for provider 
3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 558.298655] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.321s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 558.299162] env[60548]: DEBUG nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 558.351637] env[60548]: DEBUG nova.compute.utils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 558.354944] env[60548]: DEBUG nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Not allocating networking since 'none' was specified. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 558.368392] env[60548]: DEBUG nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 558.378112] env[60548]: DEBUG nova.policy [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4c2b8b857ba94465bb210580868b9c63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '911556aa558a4d248043490afec5ee5e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 558.459135] env[60548]: DEBUG nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Start spawning the instance on the hypervisor. 
{{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 558.488149] env[60548]: DEBUG nova.virt.hardware [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 558.488484] env[60548]: DEBUG nova.virt.hardware [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 558.488571] env[60548]: DEBUG nova.virt.hardware [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 558.492259] env[60548]: DEBUG nova.virt.hardware [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 558.492259] env[60548]: DEBUG nova.virt.hardware [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 558.492259] env[60548]: DEBUG nova.virt.hardware [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 558.492259] env[60548]: DEBUG nova.virt.hardware [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 558.492259] env[60548]: DEBUG nova.virt.hardware [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 558.492436] 
env[60548]: DEBUG nova.virt.hardware [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 558.492436] env[60548]: DEBUG nova.virt.hardware [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 558.492436] env[60548]: DEBUG nova.virt.hardware [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 558.493364] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be9d0199-0387-425a-b41c-5adb2c9170e0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.502540] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b516c265-d852-4e6a-a592-23d5ba8fd5c7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.518476] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Instance VIF info [] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 558.529945] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 558.530768] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f2d068a8-1b15-4806-a21f-6eafcae3f7c4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.543859] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Created folder: OpenStack in parent group-v4. [ 558.544083] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Creating folder: Project (d9c6fc1f21434ba8adf24e3dea4eba36). Parent ref: group-v850287. 
{{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 558.544328] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0bb9ab6f-685c-4aed-be16-60e26c791b51 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.556953] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Created folder: Project (d9c6fc1f21434ba8adf24e3dea4eba36) in parent group-v850287. [ 558.557242] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Creating folder: Instances. Parent ref: group-v850288. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 558.557407] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4930626d-ef18-4193-88dd-0318bb1c9a87 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.567809] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Created folder: Instances in parent group-v850288. [ 558.568089] env[60548]: DEBUG oslo.service.loopingcall [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 558.568357] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 558.568471] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1ef63191-9b40-42e7-b3e5-de0736d56d48 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.586790] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 558.586790] env[60548]: value = "task-4323277" [ 558.586790] env[60548]: _type = "Task" [ 558.586790] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 558.595927] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323277, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 559.099986] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323277, 'name': CreateVM_Task} progress is 25%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 559.411352] env[60548]: DEBUG nova.network.neutron [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Successfully created port: 610d3091-1314-4a6e-94f4-ccfcebc1cc01 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 559.601953] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323277, 'name': CreateVM_Task} progress is 25%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 560.100387] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323277, 'name': CreateVM_Task} progress is 25%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 560.184680] env[60548]: DEBUG nova.network.neutron [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Successfully created port: fe9b78f9-9aa4-482f-8d36-4b5c359f7121 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 560.601601] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323277, 'name': CreateVM_Task} progress is 99%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 561.107682] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323277, 'name': CreateVM_Task, 'duration_secs': 2.072364} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 561.107682] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 561.109451] env[60548]: DEBUG oslo_vmware.service [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85c868eb-2fb2-4417-92bf-19497b7f3c33 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.120363] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 561.120363] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 561.120449] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 561.120734] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1d543bbe-78cf-4cef-a5db-cb8eca551c27 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.129518] env[60548]: DEBUG oslo_vmware.api [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Waiting for the task: (returnval){ [ 561.129518] env[60548]: value = 
"session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52cd6b8d-f31b-a719-d591-ebe175c06a16" [ 561.129518] env[60548]: _type = "Task" [ 561.129518] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 561.144484] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 561.147456] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 561.147743] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 561.147988] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 561.148463] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 561.148730] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-233ac1fc-dec4-40a1-b8e1-1d950792da2d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.171546] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 561.171772] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 561.172804] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-227486fa-cd0b-4d45-80fb-e477c8066ca6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.182033] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cf0b3392-7c73-45c9-83ca-1c8f56cc4bcc {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.189654] env[60548]: DEBUG oslo_vmware.api [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Waiting for the task: (returnval){ [ 561.189654] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5244c604-fdb7-b2c6-9c07-21bcba6ae995" [ 561.189654] env[60548]: _type = "Task" [ 561.189654] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 561.202402] env[60548]: DEBUG oslo_vmware.api [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5244c604-fdb7-b2c6-9c07-21bcba6ae995, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 561.499676] env[60548]: DEBUG nova.network.neutron [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Successfully updated port: 610d3091-1314-4a6e-94f4-ccfcebc1cc01 {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 561.516488] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Acquiring lock "refresh_cache-31e48a76-ffbc-4bd2-a01f-2a69df2de5f5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 561.516488] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Acquired lock "refresh_cache-31e48a76-ffbc-4bd2-a01f-2a69df2de5f5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 561.516488] env[60548]: DEBUG nova.network.neutron [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 561.611198] env[60548]: DEBUG nova.network.neutron [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Instance cache missing network info. 
{{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 561.704921] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 561.705250] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Creating directory with path [datastore1] vmware_temp/f5a9fe2e-f6fc-4c3c-8902-89ba1d50383c/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 561.705505] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-07057a69-b329-4ee5-8dd7-c899bf21682b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.727739] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Created directory with path [datastore1] vmware_temp/f5a9fe2e-f6fc-4c3c-8902-89ba1d50383c/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 561.727940] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Fetch image to [datastore1] vmware_temp/f5a9fe2e-f6fc-4c3c-8902-89ba1d50383c/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 561.728127] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/f5a9fe2e-f6fc-4c3c-8902-89ba1d50383c/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 561.728928] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-701e6ca9-2c0b-427b-b37f-5ecb60703181 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.739104] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ee0b174-34b9-4bf7-95d4-17225ee7b87b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.750664] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ca3f2ab-91db-4645-a404-122a7d027b26 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.782841] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d58d6e2b-d7d5-48f6-89f7-c83b33c80f67 {{(pid=60548) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.790411] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-77a80b20-ca4e-42a9-8678-a7b4b96e81fa {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.886911] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 561.960037] env[60548]: DEBUG oslo_vmware.rw_handles [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f5a9fe2e-f6fc-4c3c-8902-89ba1d50383c/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 562.032121] env[60548]: DEBUG nova.network.neutron [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Updating instance_info_cache with network_info: [{"id": "610d3091-1314-4a6e-94f4-ccfcebc1cc01", "address": "fa:16:3e:96:ed:0e", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap610d3091-13", "ovs_interfaceid": "610d3091-1314-4a6e-94f4-ccfcebc1cc01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 562.037276] env[60548]: DEBUG oslo_vmware.rw_handles [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Completed reading data from the image iterator. 
{{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 562.037276] env[60548]: DEBUG oslo_vmware.rw_handles [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f5a9fe2e-f6fc-4c3c-8902-89ba1d50383c/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 562.050822] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Releasing lock "refresh_cache-31e48a76-ffbc-4bd2-a01f-2a69df2de5f5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 562.050822] env[60548]: DEBUG nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Instance network_info: |[{"id": "610d3091-1314-4a6e-94f4-ccfcebc1cc01", "address": "fa:16:3e:96:ed:0e", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap610d3091-13", "ovs_interfaceid": "610d3091-1314-4a6e-94f4-ccfcebc1cc01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 562.050949] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:96:ed:0e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '15922696-dc08-44ef-97be-0b09a9dfeae8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '610d3091-1314-4a6e-94f4-ccfcebc1cc01', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 562.061728] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Creating folder: Project (911556aa558a4d248043490afec5ee5e). Parent ref: group-v850287. 
{{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 562.062817] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fda8ea50-b4a9-4c2d-b6fa-1e3d9d6439e9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.076666] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Created folder: Project (911556aa558a4d248043490afec5ee5e) in parent group-v850287. [ 562.076858] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Creating folder: Instances. Parent ref: group-v850291. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 562.077111] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-45668911-d0b1-4f71-a9bf-3470d035387a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.088191] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Created folder: Instances in parent group-v850291. [ 562.088430] env[60548]: DEBUG oslo.service.loopingcall [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 562.088613] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 562.088817] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6b4b35d9-cf97-4bd4-804d-cec2bc9ffac6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.112498] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 562.112498] env[60548]: value = "task-4323280" [ 562.112498] env[60548]: _type = "Task" [ 562.112498] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 562.123368] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323280, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 562.191042] env[60548]: DEBUG nova.network.neutron [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Successfully updated port: 8bfce6a0-ed43-42e6-8285-8985812271fa {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 562.207661] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Acquiring lock "refresh_cache-b974272a-5c32-4ed2-99db-1b1ac744d08c" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 562.207832] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Acquired lock "refresh_cache-b974272a-5c32-4ed2-99db-1b1ac744d08c" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 562.207957] env[60548]: DEBUG nova.network.neutron [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 562.379985] env[60548]: DEBUG nova.network.neutron [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 562.630937] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323280, 'name': CreateVM_Task, 'duration_secs': 0.375806} completed successfully. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 562.632172] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 562.648190] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 562.648851] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 562.648851] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 562.649545] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-83263e66-3dab-4e27-b885-45a419b44f22 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.657288] env[60548]: DEBUG oslo_vmware.api [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Waiting for the task: (returnval){ [ 562.657288] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52acac02-79a6-88a1-c565-d48739cdb46e" [ 562.657288] env[60548]: _type = "Task" [ 562.657288] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 562.667452] env[60548]: DEBUG oslo_vmware.api [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52acac02-79a6-88a1-c565-d48739cdb46e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 563.174449] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 563.174725] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 563.174984] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 563.479760] env[60548]: DEBUG nova.network.neutron [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Updating instance_info_cache with network_info: [{"id": "8bfce6a0-ed43-42e6-8285-8985812271fa", "address": "fa:16:3e:6c:ce:b5", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.59", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8bfce6a0-ed", "ovs_interfaceid": "8bfce6a0-ed43-42e6-8285-8985812271fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 563.502961] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Releasing lock "refresh_cache-b974272a-5c32-4ed2-99db-1b1ac744d08c" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 563.503409] env[60548]: DEBUG nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Instance network_info: |[{"id": "8bfce6a0-ed43-42e6-8285-8985812271fa", "address": 
"fa:16:3e:6c:ce:b5", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.59", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8bfce6a0-ed", "ovs_interfaceid": "8bfce6a0-ed43-42e6-8285-8985812271fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 563.503742] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6c:ce:b5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '15922696-dc08-44ef-97be-0b09a9dfeae8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8bfce6a0-ed43-42e6-8285-8985812271fa', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 563.515121] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Creating folder: Project (c1b2e2fde40e4ec799922e3ac4780416). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 563.516048] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d963b24e-a330-4fa1-ab90-82b0f87de6ca {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.528904] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Created folder: Project (c1b2e2fde40e4ec799922e3ac4780416) in parent group-v850287. [ 563.529126] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Creating folder: Instances. Parent ref: group-v850294. 
{{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 563.529364] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ce0cebd8-175e-47b3-bfc1-58271b4dc54e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.545625] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Created folder: Instances in parent group-v850294. [ 563.545625] env[60548]: DEBUG oslo.service.loopingcall [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 563.546198] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 563.546904] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0d97eedf-27b1-4a60-a55f-ce2217fe702f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.575024] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 563.575024] env[60548]: value = "task-4323283" [ 563.575024] env[60548]: _type = "Task" [ 563.575024] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 563.586575] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323283, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 563.703367] env[60548]: DEBUG nova.network.neutron [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Successfully updated port: fe9b78f9-9aa4-482f-8d36-4b5c359f7121 {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 563.714369] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquiring lock "refresh_cache-3a7076f4-fc00-4f82-804b-4dac0de9ab3d" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 563.714369] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquired lock "refresh_cache-3a7076f4-fc00-4f82-804b-4dac0de9ab3d" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 563.714369] env[60548]: DEBUG nova.network.neutron [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 563.781600] env[60548]: DEBUG nova.compute.manager [req-ead3d0e1-8ec3-4657-8995-bcb6d76cf3ce req-620dfba5-0c2f-445c-ba63-aafa2d48f675 service nova] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Received event network-vif-plugged-610d3091-1314-4a6e-94f4-ccfcebc1cc01 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 563.781830] env[60548]: DEBUG oslo_concurrency.lockutils [req-ead3d0e1-8ec3-4657-8995-bcb6d76cf3ce req-620dfba5-0c2f-445c-ba63-aafa2d48f675 service nova] Acquiring lock "31e48a76-ffbc-4bd2-a01f-2a69df2de5f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 563.782098] env[60548]: DEBUG oslo_concurrency.lockutils [req-ead3d0e1-8ec3-4657-8995-bcb6d76cf3ce req-620dfba5-0c2f-445c-ba63-aafa2d48f675 service nova] Lock "31e48a76-ffbc-4bd2-a01f-2a69df2de5f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 563.782204] env[60548]: DEBUG oslo_concurrency.lockutils [req-ead3d0e1-8ec3-4657-8995-bcb6d76cf3ce req-620dfba5-0c2f-445c-ba63-aafa2d48f675 service nova] Lock "31e48a76-ffbc-4bd2-a01f-2a69df2de5f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.782368] env[60548]: DEBUG nova.compute.manager [req-ead3d0e1-8ec3-4657-8995-bcb6d76cf3ce req-620dfba5-0c2f-445c-ba63-aafa2d48f675 service nova] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] No waiting events found dispatching network-vif-plugged-610d3091-1314-4a6e-94f4-ccfcebc1cc01 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 563.782526] env[60548]:
WARNING nova.compute.manager [req-ead3d0e1-8ec3-4657-8995-bcb6d76cf3ce req-620dfba5-0c2f-445c-ba63-aafa2d48f675 service nova] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Received unexpected event network-vif-plugged-610d3091-1314-4a6e-94f4-ccfcebc1cc01 for instance with vm_state building and task_state spawning. [ 563.814235] env[60548]: DEBUG nova.network.neutron [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 564.089331] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323283, 'name': CreateVM_Task, 'duration_secs': 0.4005} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 564.089510] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 564.090200] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 564.090351] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 564.090927] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 564.091190] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-de58ba89-1c0f-4a84-839b-0c9cbd3e4bf0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.098043] env[60548]: DEBUG oslo_vmware.api [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Waiting for the task: (returnval){ [ 564.098043] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52158100-23ed-3d45-e61a-057e90d79328" [ 564.098043] env[60548]: _type = "Task" [ 564.098043] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 564.108781] env[60548]: DEBUG oslo_vmware.api [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52158100-23ed-3d45-e61a-057e90d79328, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 564.163008] env[60548]: DEBUG nova.compute.manager [req-cd7a8381-7a91-4a47-bdef-a784eec72922 req-5b5d8b2b-fe81-4735-a578-b6e7c6719685 service nova] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Received event network-vif-plugged-8bfce6a0-ed43-42e6-8285-8985812271fa {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 564.163292] env[60548]: DEBUG oslo_concurrency.lockutils [req-cd7a8381-7a91-4a47-bdef-a784eec72922 req-5b5d8b2b-fe81-4735-a578-b6e7c6719685 service nova] Acquiring lock "b974272a-5c32-4ed2-99db-1b1ac744d08c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 564.163537] env[60548]: DEBUG oslo_concurrency.lockutils [req-cd7a8381-7a91-4a47-bdef-a784eec72922 req-5b5d8b2b-fe81-4735-a578-b6e7c6719685 service nova] Lock "b974272a-5c32-4ed2-99db-1b1ac744d08c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 564.165083] env[60548]: DEBUG oslo_concurrency.lockutils [req-cd7a8381-7a91-4a47-bdef-a784eec72922 req-5b5d8b2b-fe81-4735-a578-b6e7c6719685 service nova] Lock "b974272a-5c32-4ed2-99db-1b1ac744d08c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 564.165083] env[60548]: DEBUG nova.compute.manager [req-cd7a8381-7a91-4a47-bdef-a784eec72922 req-5b5d8b2b-fe81-4735-a578-b6e7c6719685 service nova] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] No waiting events found dispatching network-vif-plugged-8bfce6a0-ed43-42e6-8285-8985812271fa {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 564.165083] env[60548]: WARNING nova.compute.manager [req-cd7a8381-7a91-4a47-bdef-a784eec72922 req-5b5d8b2b-fe81-4735-a578-b6e7c6719685 service nova] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Received unexpected event network-vif-plugged-8bfce6a0-ed43-42e6-8285-8985812271fa for instance with vm_state building and task_state spawning.
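[editor's note] The WARNING entries above show Nova's external-event plumbing: Neutron reports network-vif-plugged for a port, the compute manager checks for a registered waiter under the per-instance "-events" lock, and, finding none ("No waiting events found"), logs the event as unexpected and drops it. The following is a minimal sketch of that register/pop pattern, for illustration only; it is not Nova's actual InstanceEvents implementation, and all names in it are hypothetical.

import threading

class InstanceEvents:
    """Toy registry of waiters keyed by (instance_uuid, event_name)."""

    def __init__(self):
        self._lock = threading.Lock()   # plays the role of the "-events" lock
        self._waiters = {}              # (uuid, event) -> threading.Event

    def prepare_for_event(self, uuid, event):
        # A spawner registers interest *before* triggering the action
        # (e.g. before plugging the VIF), then blocks on the Event.
        waiter = threading.Event()
        with self._lock:
            self._waiters[(uuid, event)] = waiter
        return waiter

    def pop_event(self, uuid, event):
        with self._lock:
            return self._waiters.pop((uuid, event), None)

def external_instance_event(events, uuid, event):
    waiter = events.pop_event(uuid, event)
    if waiter is None:
        # Mirrors the WARNING above: nobody was waiting for this event.
        print("Received unexpected event %s for instance %s" % (event, uuid))
    else:
        waiter.set()                    # wake the blocked spawner

events = InstanceEvents()
external_instance_event(events,
                        "b974272a-5c32-4ed2-99db-1b1ac744d08c",
                        "network-vif-plugged-8bfce6a0")

In this trace no waiter exists because the build had not yet reached the point of waiting on VIF plug, so the events arrive early and are discarded; the instances still finish spawning normally.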
[ 564.429544] env[60548]: DEBUG nova.network.neutron [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Updating instance_info_cache with network_info: [{"id": "fe9b78f9-9aa4-482f-8d36-4b5c359f7121", "address": "fa:16:3e:48:ea:20", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfe9b78f9-9a", "ovs_interfaceid": "fe9b78f9-9aa4-482f-8d36-4b5c359f7121", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 564.454996] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Releasing lock "refresh_cache-3a7076f4-fc00-4f82-804b-4dac0de9ab3d" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 564.454996] env[60548]: DEBUG nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Instance network_info: |[{"id": "fe9b78f9-9aa4-482f-8d36-4b5c359f7121", "address": "fa:16:3e:48:ea:20", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfe9b78f9-9a", "ovs_interfaceid": "fe9b78f9-9aa4-482f-8d36-4b5c359f7121", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 564.455221] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 
tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:48:ea:20', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '15922696-dc08-44ef-97be-0b09a9dfeae8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fe9b78f9-9aa4-482f-8d36-4b5c359f7121', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 564.464162] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Creating folder: Project (8a91009401dd409ca662573757dfaf88). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 564.465352] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-91289938-7782-4019-9eff-30d964d56335 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.479925] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Created folder: Project (8a91009401dd409ca662573757dfaf88) in parent group-v850287. [ 564.479925] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Creating folder: Instances. Parent ref: group-v850297. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 564.479925] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f47f0ad5-d82b-455b-a1fe-35d6c0a2e321 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.490229] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Created folder: Instances in parent group-v850297. [ 564.490440] env[60548]: DEBUG oslo.service.loopingcall [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 564.490577] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 564.490783] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-be57f485-d1b3-4d9a-96f5-859cf1831120 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 564.512775] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 564.512775] env[60548]: value = "task-4323286" [ 564.512775] env[60548]: _type = "Task" [ 564.512775] env[60548]: } to complete. 
{{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 564.521776] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323286, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 564.617495] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 564.617495] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 564.617495] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 565.028537] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323286, 'name': CreateVM_Task, 'duration_secs': 0.343913} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 565.028809] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 565.029484] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 565.029646] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 565.029958] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 565.030233] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3ea7bb06-1355-4356-ab08-e41fe11d2ecc {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 565.037211] env[60548]: DEBUG oslo_vmware.api [None 
req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Waiting for the task: (returnval){ [ 565.037211] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5288d185-dc52-9703-f861-6fd2b6c5b20a" [ 565.037211] env[60548]: _type = "Task" [ 565.037211] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 565.047390] env[60548]: DEBUG oslo_vmware.api [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5288d185-dc52-9703-f861-6fd2b6c5b20a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 565.549980] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 565.550739] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 565.550739] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 565.791992] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Acquiring lock "9a14b9d0-876b-45c6-825e-103caac6bef9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 565.792265] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Lock "9a14b9d0-876b-45c6-825e-103caac6bef9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 565.802237] env[60548]: DEBUG nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Starting instance... 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 565.865945] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 565.865945] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 565.867624] env[60548]: INFO nova.compute.claims [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 566.030255] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09cce09d-312c-4485-9caa-185befcbc150 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.040295] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd8a3df7-a273-468f-9f87-50b7523e5649 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.082230] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b84b5fb4-fe4b-4e60-bd2d-a25ae8f24ede {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.090806] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6989b403-0fce-441a-8b3c-5d3a14cde230 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.110799] env[60548]: DEBUG nova.compute.provider_tree [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 566.123343] env[60548]: DEBUG nova.scheduler.client.report [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 566.181084] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60548) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 566.181084] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 566.181084] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Starting heal instance info cache {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 566.181084] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Rebuilding the list of instances to heal {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 566.210556] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 566.210556] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 566.210556] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 566.210745] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 566.211080] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 566.211080] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Didn't find any instances for network info cache update. 
{{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 566.211640] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 566.211953] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 566.212319] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 566.212562] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 566.212794] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 566.213192] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 566.216628] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60548) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 566.217409] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 566.237821] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.372s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 566.238818] env[60548]: DEBUG nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Start building networks asynchronously for instance. 
{{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 566.245076] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 566.245297] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 566.245454] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 566.245778] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60548) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 566.249301] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8f0653a-dded-486a-a2d8-1889f07e98c0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.259134] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a297a15-a152-4c9d-81c1-1e8988e70772 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.289716] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-311bb426-1a7c-4964-9fda-ac436a6ad790 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.298920] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c0c1502-cc8c-4427-a5d5-b58a0cf107af {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.306533] env[60548]: DEBUG nova.compute.utils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 566.309355] env[60548]: DEBUG nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Allocating IP information in the background. 
{{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 566.309547] env[60548]: DEBUG nova.network.neutron [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 566.345896] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180704MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60548) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 566.346068] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 566.346343] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 566.349595] env[60548]: DEBUG nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 566.449867] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b974272a-5c32-4ed2-99db-1b1ac744d08c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 566.450036] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 566.450160] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3a7076f4-fc00-4f82-804b-4dac0de9ab3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 566.450287] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance d76e8d11-53d3-417d-b6a6-08bdff8165d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 566.450401] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 9a14b9d0-876b-45c6-825e-103caac6bef9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 566.451839] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 566.451997] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=100GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 566.458622] env[60548]: DEBUG nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 566.488544] env[60548]: DEBUG nova.virt.hardware [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=<?>,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-07-22T11:06:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 566.488768] env[60548]: DEBUG nova.virt.hardware [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 566.488948] env[60548]: DEBUG nova.virt.hardware [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 566.490830] env[60548]: DEBUG nova.virt.hardware [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 566.490959] env[60548]: DEBUG nova.virt.hardware [None 
req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 566.491123] env[60548]: DEBUG nova.virt.hardware [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 566.491393] env[60548]: DEBUG nova.virt.hardware [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 566.491495] env[60548]: DEBUG nova.virt.hardware [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 566.491653] env[60548]: DEBUG nova.virt.hardware [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 566.491819] env[60548]: DEBUG nova.virt.hardware [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 566.492431] env[60548]: DEBUG nova.virt.hardware [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 566.493439] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f65ca2d3-a338-4f16-ad9a-02571d7a0f31 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.509709] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-673f924a-24c6-4d5c-9cf1-8163e69f19c9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.564252] env[60548]: DEBUG nova.policy [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1e283359256141eda083eb00dd5ebbab', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1f23b8194c69430d893ed629fc9ba2c8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 566.582703] env[60548]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-519aa0ee-7643-45d4-8b90-752339c8dc24 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.593027] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2beb00bb-5b22-452b-b8b2-3fb6c0fb3d6e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.625842] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbc86c78-a69a-4d71-815c-6634e8dcd2c5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.638840] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab29a085-f45f-446c-902f-a6e8a81c943e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.658892] env[60548]: DEBUG nova.compute.provider_tree [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 566.672442] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 566.683180] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquiring lock "b2a1b0c6-91cd-437d-b5bf-ca618b008c23" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 566.683497] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Lock "b2a1b0c6-91cd-437d-b5bf-ca618b008c23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 566.698086] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60548) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 566.698086] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.351s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 566.698086] env[60548]: 
DEBUG nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 566.763571] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 566.763571] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 566.765289] env[60548]: INFO nova.compute.claims [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 566.945129] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8276d07-979f-479e-8c04-71bbbb8498d0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.961414] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-883d24fd-298e-4d32-8577-2e5882d72230 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.969361] env[60548]: DEBUG nova.compute.manager [req-f4850307-c76f-46ef-bb42-fea6c2493ea8 req-75a97207-8af0-47d0-b8be-d9cbe42e1265 service nova] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Received event network-changed-610d3091-1314-4a6e-94f4-ccfcebc1cc01 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 566.969361] env[60548]: DEBUG nova.compute.manager [req-f4850307-c76f-46ef-bb42-fea6c2493ea8 req-75a97207-8af0-47d0-b8be-d9cbe42e1265 service nova] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Refreshing instance network info cache due to event network-changed-610d3091-1314-4a6e-94f4-ccfcebc1cc01. 
{{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 566.969361] env[60548]: DEBUG oslo_concurrency.lockutils [req-f4850307-c76f-46ef-bb42-fea6c2493ea8 req-75a97207-8af0-47d0-b8be-d9cbe42e1265 service nova] Acquiring lock "refresh_cache-31e48a76-ffbc-4bd2-a01f-2a69df2de5f5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 566.969361] env[60548]: DEBUG oslo_concurrency.lockutils [req-f4850307-c76f-46ef-bb42-fea6c2493ea8 req-75a97207-8af0-47d0-b8be-d9cbe42e1265 service nova] Acquired lock "refresh_cache-31e48a76-ffbc-4bd2-a01f-2a69df2de5f5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 566.969361] env[60548]: DEBUG nova.network.neutron [req-f4850307-c76f-46ef-bb42-fea6c2493ea8 req-75a97207-8af0-47d0-b8be-d9cbe42e1265 service nova] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Refreshing network info cache for port 610d3091-1314-4a6e-94f4-ccfcebc1cc01 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 567.007639] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d0de8d3-1cf0-4926-b81c-43a2d521af08 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.019614] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08c72ce3-9473-403d-a84e-6b4c568b759c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.029294] env[60548]: DEBUG nova.compute.manager [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Received event network-changed-8bfce6a0-ed43-42e6-8285-8985812271fa {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 567.029473] env[60548]: DEBUG nova.compute.manager [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Refreshing instance network info cache due to event network-changed-8bfce6a0-ed43-42e6-8285-8985812271fa. 
{{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 567.029679] env[60548]: DEBUG oslo_concurrency.lockutils [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] Acquiring lock "refresh_cache-b974272a-5c32-4ed2-99db-1b1ac744d08c" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 567.029875] env[60548]: DEBUG oslo_concurrency.lockutils [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] Acquired lock "refresh_cache-b974272a-5c32-4ed2-99db-1b1ac744d08c" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 567.029968] env[60548]: DEBUG nova.network.neutron [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Refreshing network info cache for port 8bfce6a0-ed43-42e6-8285-8985812271fa {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 567.040427] env[60548]: DEBUG nova.compute.provider_tree [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 567.058126] env[60548]: DEBUG nova.scheduler.client.report [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 567.079699] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.316s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 567.080222] env[60548]: DEBUG nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Start building networks asynchronously for instance. 
{{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 567.123286] env[60548]: DEBUG nova.compute.utils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 567.128020] env[60548]: DEBUG nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Not allocating networking since 'none' was specified. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 567.141464] env[60548]: DEBUG nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 567.231695] env[60548]: DEBUG nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 567.262864] env[60548]: DEBUG nova.virt.hardware [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=<?>,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-07-22T11:06:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 567.263137] env[60548]: DEBUG nova.virt.hardware [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 567.263288] env[60548]: DEBUG nova.virt.hardware [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 567.263458] env[60548]: DEBUG nova.virt.hardware [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 567.263595] env[60548]: DEBUG nova.virt.hardware [None 
req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 567.263732] env[60548]: DEBUG nova.virt.hardware [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 567.263979] env[60548]: DEBUG nova.virt.hardware [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 567.268109] env[60548]: DEBUG nova.virt.hardware [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 567.268415] env[60548]: DEBUG nova.virt.hardware [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 567.272071] env[60548]: DEBUG nova.virt.hardware [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 567.272277] env[60548]: DEBUG nova.virt.hardware [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 567.273517] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c33e02d1-7be5-42f7-9621-15809ddf34c6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.290801] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c81e9a18-277f-466c-9774-8b52138944dc {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.324240] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Instance VIF info [] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 567.330210] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Creating folder: Project (856c0560e661462ab33a15624adf3b15). Parent ref: group-v850287. 
{{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 567.330659] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c6c8c6df-3ff7-42b1-aa1e-db90558b2071 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.344500] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Created folder: Project (856c0560e661462ab33a15624adf3b15) in parent group-v850287. [ 567.345822] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Creating folder: Instances. Parent ref: group-v850300. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 567.345822] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5c21bda8-79a4-4bea-b09e-55d711a34c2a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.356175] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Created folder: Instances in parent group-v850300. [ 567.356629] env[60548]: DEBUG oslo.service.loopingcall [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 567.357575] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 567.357575] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-5d377cbe-2750-4abe-9418-832f8e7b79b5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.375921] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 567.375921] env[60548]: value = "task-4323289" [ 567.375921] env[60548]: _type = "Task" [ 567.375921] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 567.384957] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323289, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 567.867913] env[60548]: DEBUG nova.network.neutron [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Successfully created port: a0727f54-5f66-4fd8-9aad-2622e897112d {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 567.893116] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323289, 'name': CreateVM_Task, 'duration_secs': 0.28412} completed successfully. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 567.893116] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 567.893116] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 567.893116] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 567.893116] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 567.893459] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b9df02e3-e2c1-4456-9127-58a252b0b813 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.900544] env[60548]: DEBUG oslo_vmware.api [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Waiting for the task: (returnval){ [ 567.900544] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]525bda7f-ea15-2bd7-8232-20b2bb015335" [ 567.900544] env[60548]: _type = "Task" [ 567.900544] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 567.911052] env[60548]: DEBUG oslo_vmware.api [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]525bda7f-ea15-2bd7-8232-20b2bb015335, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 568.366994] env[60548]: DEBUG nova.network.neutron [req-f4850307-c76f-46ef-bb42-fea6c2493ea8 req-75a97207-8af0-47d0-b8be-d9cbe42e1265 service nova] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Updated VIF entry in instance network info cache for port 610d3091-1314-4a6e-94f4-ccfcebc1cc01. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 568.367933] env[60548]: DEBUG nova.network.neutron [req-f4850307-c76f-46ef-bb42-fea6c2493ea8 req-75a97207-8af0-47d0-b8be-d9cbe42e1265 service nova] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Updating instance_info_cache with network_info: [{"id": "610d3091-1314-4a6e-94f4-ccfcebc1cc01", "address": "fa:16:3e:96:ed:0e", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap610d3091-13", "ovs_interfaceid": "610d3091-1314-4a6e-94f4-ccfcebc1cc01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 568.379472] env[60548]: DEBUG oslo_concurrency.lockutils [req-f4850307-c76f-46ef-bb42-fea6c2493ea8 req-75a97207-8af0-47d0-b8be-d9cbe42e1265 service nova] Releasing lock "refresh_cache-31e48a76-ffbc-4bd2-a01f-2a69df2de5f5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 568.413909] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 568.415449] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 568.415449] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 568.870437] env[60548]: DEBUG nova.network.neutron [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Updated VIF entry in instance network info cache for port 8bfce6a0-ed43-42e6-8285-8985812271fa. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 568.870437] env[60548]: DEBUG nova.network.neutron [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Updating instance_info_cache with network_info: [{"id": "8bfce6a0-ed43-42e6-8285-8985812271fa", "address": "fa:16:3e:6c:ce:b5", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.59", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8bfce6a0-ed", "ovs_interfaceid": "8bfce6a0-ed43-42e6-8285-8985812271fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 568.884136] env[60548]: DEBUG oslo_concurrency.lockutils [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] Releasing lock "refresh_cache-b974272a-5c32-4ed2-99db-1b1ac744d08c" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 568.884538] env[60548]: DEBUG nova.compute.manager [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Received event network-vif-plugged-fe9b78f9-9aa4-482f-8d36-4b5c359f7121 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 568.884733] env[60548]: DEBUG oslo_concurrency.lockutils [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] Acquiring lock "3a7076f4-fc00-4f82-804b-4dac0de9ab3d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 568.884922] env[60548]: DEBUG oslo_concurrency.lockutils [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] Lock "3a7076f4-fc00-4f82-804b-4dac0de9ab3d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 568.885102] env[60548]: DEBUG oslo_concurrency.lockutils [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] Lock "3a7076f4-fc00-4f82-804b-4dac0de9ab3d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 568.885261] env[60548]: DEBUG nova.compute.manager [req-a9c70be6-10ad-4975-8687-876566a6f392
req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] No waiting events found dispatching network-vif-plugged-fe9b78f9-9aa4-482f-8d36-4b5c359f7121 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 568.885418] env[60548]: WARNING nova.compute.manager [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Received unexpected event network-vif-plugged-fe9b78f9-9aa4-482f-8d36-4b5c359f7121 for instance with vm_state building and task_state spawning. [ 568.885567] env[60548]: DEBUG nova.compute.manager [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Received event network-changed-fe9b78f9-9aa4-482f-8d36-4b5c359f7121 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 568.885708] env[60548]: DEBUG nova.compute.manager [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Refreshing instance network info cache due to event network-changed-fe9b78f9-9aa4-482f-8d36-4b5c359f7121. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 568.885901] env[60548]: DEBUG oslo_concurrency.lockutils [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] Acquiring lock "refresh_cache-3a7076f4-fc00-4f82-804b-4dac0de9ab3d" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 568.886019] env[60548]: DEBUG oslo_concurrency.lockutils [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] Acquired lock "refresh_cache-3a7076f4-fc00-4f82-804b-4dac0de9ab3d" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 568.890344] env[60548]: DEBUG nova.network.neutron [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Refreshing network info cache for port fe9b78f9-9aa4-482f-8d36-4b5c359f7121 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 570.444195] env[60548]: DEBUG nova.network.neutron [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Updated VIF entry in instance network info cache for port fe9b78f9-9aa4-482f-8d36-4b5c359f7121. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 570.444195] env[60548]: DEBUG nova.network.neutron [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Updating instance_info_cache with network_info: [{"id": "fe9b78f9-9aa4-482f-8d36-4b5c359f7121", "address": "fa:16:3e:48:ea:20", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfe9b78f9-9a", "ovs_interfaceid": "fe9b78f9-9aa4-482f-8d36-4b5c359f7121", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 570.463919] env[60548]: DEBUG oslo_concurrency.lockutils [req-a9c70be6-10ad-4975-8687-876566a6f392 req-bb4636b2-530a-422a-b665-1bca5900a9dc service nova] Releasing lock "refresh_cache-3a7076f4-fc00-4f82-804b-4dac0de9ab3d" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 571.181738] env[60548]: DEBUG nova.network.neutron [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Successfully updated port: a0727f54-5f66-4fd8-9aad-2622e897112d {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 571.199012] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Acquiring lock "refresh_cache-9a14b9d0-876b-45c6-825e-103caac6bef9" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 571.199217] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Acquired lock "refresh_cache-9a14b9d0-876b-45c6-825e-103caac6bef9" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 571.199347] env[60548]: DEBUG nova.network.neutron [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 571.350154] env[60548]: DEBUG nova.network.neutron [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Instance 
cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 572.642949] env[60548]: DEBUG nova.network.neutron [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Updating instance_info_cache with network_info: [{"id": "a0727f54-5f66-4fd8-9aad-2622e897112d", "address": "fa:16:3e:8a:2f:24", "network": {"id": "45ed8df3-a734-40b4-98bc-3f35817f2be2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1462603828-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1f23b8194c69430d893ed629fc9ba2c8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d4ef133-b6f3-41d1-add4-92a1482195cf", "external-id": "nsx-vlan-transportzone-446", "segmentation_id": 446, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa0727f54-5f", "ovs_interfaceid": "a0727f54-5f66-4fd8-9aad-2622e897112d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 572.660669] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Releasing lock "refresh_cache-9a14b9d0-876b-45c6-825e-103caac6bef9" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 572.661266] env[60548]: DEBUG nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Instance network_info: |[{"id": "a0727f54-5f66-4fd8-9aad-2622e897112d", "address": "fa:16:3e:8a:2f:24", "network": {"id": "45ed8df3-a734-40b4-98bc-3f35817f2be2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1462603828-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1f23b8194c69430d893ed629fc9ba2c8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d4ef133-b6f3-41d1-add4-92a1482195cf", "external-id": "nsx-vlan-transportzone-446", "segmentation_id": 446, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa0727f54-5f", "ovs_interfaceid": "a0727f54-5f66-4fd8-9aad-2622e897112d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 572.662434] 
env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8a:2f:24', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d4ef133-b6f3-41d1-add4-92a1482195cf', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a0727f54-5f66-4fd8-9aad-2622e897112d', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 572.675720] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Creating folder: Project (1f23b8194c69430d893ed629fc9ba2c8). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 572.677142] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a8244b25-85c0-4b88-bcd8-6463ab3eb377 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.689926] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Created folder: Project (1f23b8194c69430d893ed629fc9ba2c8) in parent group-v850287. [ 572.690597] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Creating folder: Instances. Parent ref: group-v850303. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 572.691229] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9a004552-4159-46cc-bf24-cf5bb59a8b1e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.701516] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Created folder: Instances in parent group-v850303. [ 573.702094] env[60548]: DEBUG oslo.service.loopingcall [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 573.702094] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 573.702804] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-97a9c35e-b861-44cc-ba5a-6c679bcc2e06 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.722432] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 573.722432] env[60548]: value = "task-4323292" [ 573.722432] env[60548]: _type = "Task" [ 573.722432] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 573.735019] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323292, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 573.902462] env[60548]: DEBUG nova.compute.manager [req-9ac6d542-6fb3-4cca-8311-c750e8a32f2f req-cb5d8005-4548-44b1-94c0-e8da6f0c1608 service nova] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Received event network-vif-plugged-a0727f54-5f66-4fd8-9aad-2622e897112d {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 573.902677] env[60548]: DEBUG oslo_concurrency.lockutils [req-9ac6d542-6fb3-4cca-8311-c750e8a32f2f req-cb5d8005-4548-44b1-94c0-e8da6f0c1608 service nova] Acquiring lock "9a14b9d0-876b-45c6-825e-103caac6bef9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 573.902866] env[60548]: DEBUG oslo_concurrency.lockutils [req-9ac6d542-6fb3-4cca-8311-c750e8a32f2f req-cb5d8005-4548-44b1-94c0-e8da6f0c1608 service nova] Lock "9a14b9d0-876b-45c6-825e-103caac6bef9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 573.903035] env[60548]: DEBUG oslo_concurrency.lockutils [req-9ac6d542-6fb3-4cca-8311-c750e8a32f2f req-cb5d8005-4548-44b1-94c0-e8da6f0c1608 service nova] Lock "9a14b9d0-876b-45c6-825e-103caac6bef9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 573.903198] env[60548]: DEBUG nova.compute.manager [req-9ac6d542-6fb3-4cca-8311-c750e8a32f2f req-cb5d8005-4548-44b1-94c0-e8da6f0c1608 service nova] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] No waiting events found dispatching network-vif-plugged-a0727f54-5f66-4fd8-9aad-2622e897112d {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 573.903353] env[60548]: WARNING nova.compute.manager [req-9ac6d542-6fb3-4cca-8311-c750e8a32f2f req-cb5d8005-4548-44b1-94c0-e8da6f0c1608 service nova] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Received unexpected event network-vif-plugged-a0727f54-5f66-4fd8-9aad-2622e897112d for instance with vm_state building and task_state spawning. [ 573.914194] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Acquiring lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 573.915311] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 573.928802] env[60548]: DEBUG nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Starting instance... 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 573.992064] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 573.992381] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 573.994636] env[60548]: INFO nova.compute.claims [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 574.192772] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43a419fd-22b3-4812-8b18-bcff42445162 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.199732] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ea1666a-5101-445d-946d-82598bfb9ce0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.242020] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c6ca188-aa86-41a6-84d7-906e39b8dd38 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.250693] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323292, 'name': CreateVM_Task, 'duration_secs': 0.311421} completed successfully. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 574.253712] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 574.253712] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 574.253712] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 574.254111] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 574.255418] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dccca1c-199a-45ea-ae15-cd1eba138525 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.259418] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9fdd08af-7fee-4cb5-8611-9bb97455fa34 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.264844] env[60548]: DEBUG oslo_vmware.api [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Waiting for the task: (returnval){ [ 574.264844] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52acb4e6-06a9-9ffc-32c0-17c661bcff25" [ 574.264844] env[60548]: _type = "Task" [ 574.264844] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 574.273108] env[60548]: DEBUG nova.compute.provider_tree [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 574.287472] env[60548]: DEBUG oslo_vmware.api [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52acb4e6-06a9-9ffc-32c0-17c661bcff25, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 574.290842] env[60548]: DEBUG nova.scheduler.client.report [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 574.318415] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 574.318921] env[60548]: DEBUG nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 574.369806] env[60548]: DEBUG nova.compute.utils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 574.372317] env[60548]: DEBUG nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 574.372697] env[60548]: DEBUG nova.network.neutron [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 574.386444] env[60548]: DEBUG nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 574.505103] env[60548]: DEBUG nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Start spawning the instance on the hypervisor. 
{{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 574.530864] env[60548]: DEBUG nova.virt.hardware [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=<?>,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-07-22T11:06:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 574.532181] env[60548]: DEBUG nova.virt.hardware [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 574.532307] env[60548]: DEBUG nova.virt.hardware [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 574.532509] env[60548]: DEBUG nova.virt.hardware [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 574.532648] env[60548]: DEBUG nova.virt.hardware [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 574.532787] env[60548]: DEBUG nova.virt.hardware [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 574.533099] env[60548]: DEBUG nova.virt.hardware [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 574.535821] env[60548]: DEBUG nova.virt.hardware [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 574.535924] env[60548]: DEBUG nova.virt.hardware [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 
tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 574.536113] env[60548]: DEBUG nova.virt.hardware [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 574.536526] env[60548]: DEBUG nova.virt.hardware [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 574.537402] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b470ebca-90c3-4feb-aa83-d889aba82d8b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.547859] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4e770dd-5cf5-4717-95bb-98fcfb2429a9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.676553] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Acquiring lock "386edc81-5f27-4e44-af7a-f5e47ded1327" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 574.677297] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Lock "386edc81-5f27-4e44-af7a-f5e47ded1327" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 574.688913] env[60548]: DEBUG nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Starting instance... 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 574.746148] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 574.746474] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 574.748648] env[60548]: INFO nova.compute.claims [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 574.787550] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 574.787550] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 574.787550] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 574.816921] env[60548]: DEBUG nova.policy [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a5a7c5287174e1785ed2db5b3fec2a6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8494da0659724fdfa3b4021c74cd6897', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 574.960132] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d20caf30-2926-430b-9bc5-06778651dcf0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.972008] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af6a9027-183c-4398-9ba3-5f8ebf4bf078 
{{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.003364] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3b27434-36a8-42ef-aace-9cf156e48455 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.011216] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-671fe903-4d89-4a42-a1dc-b321642e6455 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.027412] env[60548]: DEBUG nova.compute.provider_tree [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 575.036147] env[60548]: DEBUG nova.scheduler.client.report [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 575.056560] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 575.057102] env[60548]: DEBUG nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 575.105138] env[60548]: DEBUG nova.compute.utils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 575.106716] env[60548]: DEBUG nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Allocating IP information in the background. 
{{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 575.106933] env[60548]: DEBUG nova.network.neutron [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 575.124907] env[60548]: DEBUG nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 575.216836] env[60548]: DEBUG nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 575.264115] env[60548]: DEBUG nova.virt.hardware [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=<?>,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-07-22T11:06:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 575.264385] env[60548]: DEBUG nova.virt.hardware [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 575.266040] env[60548]: DEBUG nova.virt.hardware [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 575.266040] env[60548]: DEBUG nova.virt.hardware [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 575.266040] env[60548]: DEBUG nova.virt.hardware [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 575.269058] env[60548]: DEBUG nova.virt.hardware [None 
req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 575.269474] env[60548]: DEBUG nova.virt.hardware [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 575.269718] env[60548]: DEBUG nova.virt.hardware [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 575.269986] env[60548]: DEBUG nova.virt.hardware [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 575.270462] env[60548]: DEBUG nova.virt.hardware [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 575.270462] env[60548]: DEBUG nova.virt.hardware [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 575.271680] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2d126e8-fd69-410e-a89a-74385bf8e283 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.290783] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e783d78b-50a2-4d23-8eac-10874733b33c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.604570] env[60548]: DEBUG nova.policy [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a192e518ab934e77befe2a60bac042ef', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '54433cd0d39244edab2bd1e18d2ffe3c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 576.546810] env[60548]: DEBUG nova.network.neutron [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: 
b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Successfully created port: 0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 578.125385] env[60548]: DEBUG nova.network.neutron [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Successfully created port: f8f975b2-da5f-457b-a384-722ba5ac0720 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 578.328225] env[60548]: DEBUG nova.network.neutron [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Successfully created port: 7b5f3162-eeac-4082-8341-f6d5131748a6 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 579.162946] env[60548]: DEBUG nova.compute.manager [req-48360880-9d7a-4d6a-a81c-b9fecfb55b6f req-291ead89-3703-4b37-b117-df4b447b87af service nova] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Received event network-changed-a0727f54-5f66-4fd8-9aad-2622e897112d {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 579.163616] env[60548]: DEBUG nova.compute.manager [req-48360880-9d7a-4d6a-a81c-b9fecfb55b6f req-291ead89-3703-4b37-b117-df4b447b87af service nova] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Refreshing instance network info cache due to event network-changed-a0727f54-5f66-4fd8-9aad-2622e897112d. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 579.164145] env[60548]: DEBUG oslo_concurrency.lockutils [req-48360880-9d7a-4d6a-a81c-b9fecfb55b6f req-291ead89-3703-4b37-b117-df4b447b87af service nova] Acquiring lock "refresh_cache-9a14b9d0-876b-45c6-825e-103caac6bef9" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 579.164203] env[60548]: DEBUG oslo_concurrency.lockutils [req-48360880-9d7a-4d6a-a81c-b9fecfb55b6f req-291ead89-3703-4b37-b117-df4b447b87af service nova] Acquired lock "refresh_cache-9a14b9d0-876b-45c6-825e-103caac6bef9" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 579.164379] env[60548]: DEBUG nova.network.neutron [req-48360880-9d7a-4d6a-a81c-b9fecfb55b6f req-291ead89-3703-4b37-b117-df4b447b87af service nova] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Refreshing network info cache for port a0727f54-5f66-4fd8-9aad-2622e897112d {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 580.880891] env[60548]: DEBUG nova.network.neutron [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Successfully created port: 81746aea-a712-4714-a0ac-7b0d021469c7 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 581.303720] env[60548]: DEBUG nova.network.neutron [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Successfully updated port: 7b5f3162-eeac-4082-8341-f6d5131748a6 {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 581.322030] env[60548]: DEBUG oslo_concurrency.lockutils [None 
req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Acquiring lock "refresh_cache-386edc81-5f27-4e44-af7a-f5e47ded1327" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 581.322030] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Acquired lock "refresh_cache-386edc81-5f27-4e44-af7a-f5e47ded1327" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 581.322030] env[60548]: DEBUG nova.network.neutron [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 581.325708] env[60548]: DEBUG nova.network.neutron [req-48360880-9d7a-4d6a-a81c-b9fecfb55b6f req-291ead89-3703-4b37-b117-df4b447b87af service nova] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Updated VIF entry in instance network info cache for port a0727f54-5f66-4fd8-9aad-2622e897112d. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 581.326040] env[60548]: DEBUG nova.network.neutron [req-48360880-9d7a-4d6a-a81c-b9fecfb55b6f req-291ead89-3703-4b37-b117-df4b447b87af service nova] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Updating instance_info_cache with network_info: [{"id": "a0727f54-5f66-4fd8-9aad-2622e897112d", "address": "fa:16:3e:8a:2f:24", "network": {"id": "45ed8df3-a734-40b4-98bc-3f35817f2be2", "bridge": "br-int", "label": "tempest-ImagesTestJSON-1462603828-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1f23b8194c69430d893ed629fc9ba2c8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d4ef133-b6f3-41d1-add4-92a1482195cf", "external-id": "nsx-vlan-transportzone-446", "segmentation_id": 446, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa0727f54-5f", "ovs_interfaceid": "a0727f54-5f66-4fd8-9aad-2622e897112d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 581.340515] env[60548]: DEBUG oslo_concurrency.lockutils [req-48360880-9d7a-4d6a-a81c-b9fecfb55b6f req-291ead89-3703-4b37-b117-df4b447b87af service nova] Releasing lock "refresh_cache-9a14b9d0-876b-45c6-825e-103caac6bef9" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 581.744832] env[60548]: DEBUG nova.network.neutron [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Instance cache missing network info. 
{{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 582.950408] env[60548]: DEBUG nova.network.neutron [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Updating instance_info_cache with network_info: [{"id": "7b5f3162-eeac-4082-8341-f6d5131748a6", "address": "fa:16:3e:b0:a6:4b", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7b5f3162-ee", "ovs_interfaceid": "7b5f3162-eeac-4082-8341-f6d5131748a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 582.964654] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Releasing lock "refresh_cache-386edc81-5f27-4e44-af7a-f5e47ded1327" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 582.964991] env[60548]: DEBUG nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Instance network_info: |[{"id": "7b5f3162-eeac-4082-8341-f6d5131748a6", "address": "fa:16:3e:b0:a6:4b", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7b5f3162-ee", "ovs_interfaceid": "7b5f3162-eeac-4082-8341-f6d5131748a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 582.965749] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None 
req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b0:a6:4b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '15922696-dc08-44ef-97be-0b09a9dfeae8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7b5f3162-eeac-4082-8341-f6d5131748a6', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 582.975655] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Creating folder: Project (54433cd0d39244edab2bd1e18d2ffe3c). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 582.975907] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1030e361-dbff-4096-8a27-e08d7fdf7f5d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.989884] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Created folder: Project (54433cd0d39244edab2bd1e18d2ffe3c) in parent group-v850287. [ 582.990662] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Creating folder: Instances. Parent ref: group-v850306. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 582.990662] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4ece2361-06e1-4f78-8bce-235698ea4ddc {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.000824] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Created folder: Instances in parent group-v850306. [ 583.001092] env[60548]: DEBUG oslo.service.loopingcall [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 583.001322] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 583.001605] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6922b2fa-4b7f-4d7f-9dd9-b51cbecaebf6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.023055] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 583.023055] env[60548]: value = "task-4323295" [ 583.023055] env[60548]: _type = "Task" [ 583.023055] env[60548]: } to complete. 
{{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 583.032916] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323295, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 583.537917] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323295, 'name': CreateVM_Task} progress is 25%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 584.037844] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323295, 'name': CreateVM_Task} progress is 25%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 584.206750] env[60548]: DEBUG nova.compute.manager [req-a42085c8-b08b-4e84-b81a-d48494cb7502 req-26223a6b-8606-415b-ae66-d8cb2b7b4816 service nova] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Received event network-vif-plugged-7b5f3162-eeac-4082-8341-f6d5131748a6 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 584.207326] env[60548]: DEBUG oslo_concurrency.lockutils [req-a42085c8-b08b-4e84-b81a-d48494cb7502 req-26223a6b-8606-415b-ae66-d8cb2b7b4816 service nova] Acquiring lock "386edc81-5f27-4e44-af7a-f5e47ded1327-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.215147] env[60548]: DEBUG oslo_concurrency.lockutils [req-a42085c8-b08b-4e84-b81a-d48494cb7502 req-26223a6b-8606-415b-ae66-d8cb2b7b4816 service nova] Lock "386edc81-5f27-4e44-af7a-f5e47ded1327-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.215147] env[60548]: DEBUG oslo_concurrency.lockutils [req-a42085c8-b08b-4e84-b81a-d48494cb7502 req-26223a6b-8606-415b-ae66-d8cb2b7b4816 service nova] Lock "386edc81-5f27-4e44-af7a-f5e47ded1327-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.005s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.215147] env[60548]: DEBUG nova.compute.manager [req-a42085c8-b08b-4e84-b81a-d48494cb7502 req-26223a6b-8606-415b-ae66-d8cb2b7b4816 service nova] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] No waiting events found dispatching network-vif-plugged-7b5f3162-eeac-4082-8341-f6d5131748a6 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 584.215147] env[60548]: WARNING nova.compute.manager [req-a42085c8-b08b-4e84-b81a-d48494cb7502 req-26223a6b-8606-415b-ae66-d8cb2b7b4816 service nova] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Received unexpected event network-vif-plugged-7b5f3162-eeac-4082-8341-f6d5131748a6 for instance with vm_state building and task_state spawning. [ 584.536742] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323295, 'name': CreateVM_Task, 'duration_secs': 1.185342} completed successfully. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 584.537131] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 584.537914] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 584.538201] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 584.538633] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 584.538971] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a5f68e31-7384-4741-846b-594179eb4cf7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.544533] env[60548]: DEBUG oslo_vmware.api [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Waiting for the task: (returnval){ [ 584.544533] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5250f4df-9da2-5bfd-5d45-ce0a3410f8df" [ 584.544533] env[60548]: _type = "Task" [ 584.544533] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 584.553783] env[60548]: DEBUG oslo_vmware.api [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5250f4df-9da2-5bfd-5d45-ce0a3410f8df, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 585.059521] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 585.060039] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 585.060925] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 585.805117] env[60548]: DEBUG nova.network.neutron [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Successfully updated port: 0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1 {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 588.440855] env[60548]: DEBUG nova.network.neutron [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Successfully updated port: f8f975b2-da5f-457b-a384-722ba5ac0720 {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 588.923477] env[60548]: DEBUG nova.compute.manager [req-9d65623c-90c9-4b40-9828-835c80d83190 req-0c99f393-9143-4a64-a4b1-c9e0d84ccbed service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Received event network-vif-plugged-0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 588.923663] env[60548]: DEBUG oslo_concurrency.lockutils [req-9d65623c-90c9-4b40-9828-835c80d83190 req-0c99f393-9143-4a64-a4b1-c9e0d84ccbed service nova] Acquiring lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.923861] env[60548]: DEBUG oslo_concurrency.lockutils [req-9d65623c-90c9-4b40-9828-835c80d83190 req-0c99f393-9143-4a64-a4b1-c9e0d84ccbed service nova] Lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.924047] env[60548]: DEBUG oslo_concurrency.lockutils [req-9d65623c-90c9-4b40-9828-835c80d83190 req-0c99f393-9143-4a64-a4b1-c9e0d84ccbed service nova] Lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.924259] env[60548]: DEBUG nova.compute.manager [req-9d65623c-90c9-4b40-9828-835c80d83190 req-0c99f393-9143-4a64-a4b1-c9e0d84ccbed service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] No waiting events found dispatching network-vif-plugged-0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 588.924443] env[60548]: WARNING nova.compute.manager [req-9d65623c-90c9-4b40-9828-835c80d83190 req-0c99f393-9143-4a64-a4b1-c9e0d84ccbed service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Received unexpected event network-vif-plugged-0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1 for instance with vm_state building and task_state spawning. [ 589.532286] env[60548]: DEBUG nova.network.neutron [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Successfully updated port: 81746aea-a712-4714-a0ac-7b0d021469c7 {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 589.545778] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Acquiring lock "refresh_cache-b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 589.545957] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Acquired lock "refresh_cache-b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 589.546096] env[60548]: DEBUG nova.network.neutron [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 589.594029] env[60548]: DEBUG nova.compute.manager [req-a5cd178d-ae40-44a9-84d8-2c59c3b54176 req-7ac32495-ca64-4462-b26d-aa5556e1332c service nova] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Received event network-changed-7b5f3162-eeac-4082-8341-f6d5131748a6 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 589.594278] env[60548]: DEBUG nova.compute.manager [req-a5cd178d-ae40-44a9-84d8-2c59c3b54176 req-7ac32495-ca64-4462-b26d-aa5556e1332c service nova] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Refreshing instance network info cache due to event network-changed-7b5f3162-eeac-4082-8341-f6d5131748a6. 
{{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 589.594544] env[60548]: DEBUG oslo_concurrency.lockutils [req-a5cd178d-ae40-44a9-84d8-2c59c3b54176 req-7ac32495-ca64-4462-b26d-aa5556e1332c service nova] Acquiring lock "refresh_cache-386edc81-5f27-4e44-af7a-f5e47ded1327" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 589.594624] env[60548]: DEBUG oslo_concurrency.lockutils [req-a5cd178d-ae40-44a9-84d8-2c59c3b54176 req-7ac32495-ca64-4462-b26d-aa5556e1332c service nova] Acquired lock "refresh_cache-386edc81-5f27-4e44-af7a-f5e47ded1327" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 589.596386] env[60548]: DEBUG nova.network.neutron [req-a5cd178d-ae40-44a9-84d8-2c59c3b54176 req-7ac32495-ca64-4462-b26d-aa5556e1332c service nova] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Refreshing network info cache for port 7b5f3162-eeac-4082-8341-f6d5131748a6 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 589.604280] env[60548]: DEBUG nova.network.neutron [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 590.928689] env[60548]: DEBUG nova.network.neutron [req-a5cd178d-ae40-44a9-84d8-2c59c3b54176 req-7ac32495-ca64-4462-b26d-aa5556e1332c service nova] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Updated VIF entry in instance network info cache for port 7b5f3162-eeac-4082-8341-f6d5131748a6. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 590.928987] env[60548]: DEBUG nova.network.neutron [req-a5cd178d-ae40-44a9-84d8-2c59c3b54176 req-7ac32495-ca64-4462-b26d-aa5556e1332c service nova] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Updating instance_info_cache with network_info: [{"id": "7b5f3162-eeac-4082-8341-f6d5131748a6", "address": "fa:16:3e:b0:a6:4b", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.94", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7b5f3162-ee", "ovs_interfaceid": "7b5f3162-eeac-4082-8341-f6d5131748a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 590.943503] env[60548]: DEBUG oslo_concurrency.lockutils [req-a5cd178d-ae40-44a9-84d8-2c59c3b54176 req-7ac32495-ca64-4462-b26d-aa5556e1332c service nova] Releasing lock "refresh_cache-386edc81-5f27-4e44-af7a-f5e47ded1327" {{(pid=60548) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 591.872709] env[60548]: DEBUG nova.network.neutron [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Updating instance_info_cache with network_info: [{"id": "0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1", "address": "fa:16:3e:2a:3e:aa", "network": {"id": "50239ef5-d3f4-467d-8a7d-5ac0ddb27ad0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1857023818", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0cf84f4c-c2", "ovs_interfaceid": "0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f8f975b2-da5f-457b-a384-722ba5ac0720", "address": "fa:16:3e:b1:b3:01", "network": {"id": "f79008b6-120d-4b0e-9226-d5db9d1a2032", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-596521528", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "849fc06e-dfc2-470f-8490-034590682ea7", "external-id": "nsx-vlan-transportzone-567", "segmentation_id": 567, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf8f975b2-da", "ovs_interfaceid": "f8f975b2-da5f-457b-a384-722ba5ac0720", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81746aea-a712-4714-a0ac-7b0d021469c7", "address": "fa:16:3e:23:ca:3d", "network": {"id": "50239ef5-d3f4-467d-8a7d-5ac0ddb27ad0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1857023818", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.242", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": 
"nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81746aea-a7", "ovs_interfaceid": "81746aea-a712-4714-a0ac-7b0d021469c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 591.898128] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Releasing lock "refresh_cache-b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 591.898128] env[60548]: DEBUG nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Instance network_info: |[{"id": "0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1", "address": "fa:16:3e:2a:3e:aa", "network": {"id": "50239ef5-d3f4-467d-8a7d-5ac0ddb27ad0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1857023818", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0cf84f4c-c2", "ovs_interfaceid": "0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f8f975b2-da5f-457b-a384-722ba5ac0720", "address": "fa:16:3e:b1:b3:01", "network": {"id": "f79008b6-120d-4b0e-9226-d5db9d1a2032", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-596521528", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "849fc06e-dfc2-470f-8490-034590682ea7", "external-id": "nsx-vlan-transportzone-567", "segmentation_id": 567, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf8f975b2-da", "ovs_interfaceid": "f8f975b2-da5f-457b-a384-722ba5ac0720", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81746aea-a712-4714-a0ac-7b0d021469c7", "address": "fa:16:3e:23:ca:3d", "network": {"id": "50239ef5-d3f4-467d-8a7d-5ac0ddb27ad0", "bridge": 
"br-int", "label": "tempest-ServersTestMultiNic-1857023818", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.242", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81746aea-a7", "ovs_interfaceid": "81746aea-a712-4714-a0ac-7b0d021469c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 591.898128] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2a:3e:aa', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3b67e519-46cf-44ce-b670-4ba4c0c5b658', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:b1:b3:01', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '849fc06e-dfc2-470f-8490-034590682ea7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f8f975b2-da5f-457b-a384-722ba5ac0720', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:23:ca:3d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3b67e519-46cf-44ce-b670-4ba4c0c5b658', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '81746aea-a712-4714-a0ac-7b0d021469c7', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 591.911909] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Creating folder: Project (8494da0659724fdfa3b4021c74cd6897). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 591.912570] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d3f1e569-5d82-47d8-82c3-d03541501b35 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.925720] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Created folder: Project (8494da0659724fdfa3b4021c74cd6897) in parent group-v850287. [ 591.925930] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Creating folder: Instances. Parent ref: group-v850309. 
{{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 591.926148] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9e1a3e1a-3772-4b8f-8f38-629610f4e85c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.936141] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Created folder: Instances in parent group-v850309. [ 591.936397] env[60548]: DEBUG oslo.service.loopingcall [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 591.936639] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 591.936779] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3b798784-cfed-44e6-a6f7-bb68163b27e2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 591.968486] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 591.968486] env[60548]: value = "task-4323298" [ 591.968486] env[60548]: _type = "Task" [ 591.968486] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 591.977598] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323298, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 592.482086] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323298, 'name': CreateVM_Task, 'duration_secs': 0.410046} completed successfully. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 592.482441] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 592.484397] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 592.484577] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 592.484971] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 592.485284] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-95a73217-95ed-40cd-ae5e-2542bdd5d641 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.491167] env[60548]: DEBUG oslo_vmware.api [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Waiting for the task: (returnval){ [ 592.491167] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52e0b89a-4410-63b9-4129-eb15b12279d6" [ 592.491167] env[60548]: _type = "Task" [ 592.491167] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 592.500536] env[60548]: DEBUG oslo_vmware.api [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52e0b89a-4410-63b9-4129-eb15b12279d6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 593.005238] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 593.005548] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 593.005759] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 594.096884] env[60548]: DEBUG nova.compute.manager [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Received event network-changed-0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 594.096884] env[60548]: DEBUG nova.compute.manager [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Refreshing instance network info cache due to event network-changed-0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 594.096884] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Acquiring lock "refresh_cache-b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 594.097462] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Acquired lock "refresh_cache-b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 594.097691] env[60548]: DEBUG nova.network.neutron [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Refreshing network info cache for port 0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 594.752545] env[60548]: DEBUG nova.network.neutron [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Updated VIF entry in instance network info cache for port 0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 594.752994] env[60548]: DEBUG nova.network.neutron [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Updating instance_info_cache with network_info: [{"id": "0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1", "address": "fa:16:3e:2a:3e:aa", "network": {"id": "50239ef5-d3f4-467d-8a7d-5ac0ddb27ad0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1857023818", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0cf84f4c-c2", "ovs_interfaceid": "0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f8f975b2-da5f-457b-a384-722ba5ac0720", "address": "fa:16:3e:b1:b3:01", "network": {"id": "f79008b6-120d-4b0e-9226-d5db9d1a2032", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-596521528", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "849fc06e-dfc2-470f-8490-034590682ea7", "external-id": "nsx-vlan-transportzone-567", "segmentation_id": 567, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf8f975b2-da", "ovs_interfaceid": "f8f975b2-da5f-457b-a384-722ba5ac0720", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81746aea-a712-4714-a0ac-7b0d021469c7", "address": "fa:16:3e:23:ca:3d", "network": {"id": "50239ef5-d3f4-467d-8a7d-5ac0ddb27ad0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1857023818", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.242", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", 
"segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81746aea-a7", "ovs_interfaceid": "81746aea-a712-4714-a0ac-7b0d021469c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 594.763467] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Releasing lock "refresh_cache-b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 594.763732] env[60548]: DEBUG nova.compute.manager [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Received event network-vif-plugged-f8f975b2-da5f-457b-a384-722ba5ac0720 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 594.763925] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Acquiring lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 594.764161] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 594.764348] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.764519] env[60548]: DEBUG nova.compute.manager [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] No waiting events found dispatching network-vif-plugged-f8f975b2-da5f-457b-a384-722ba5ac0720 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 594.764680] env[60548]: WARNING nova.compute.manager [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Received unexpected event network-vif-plugged-f8f975b2-da5f-457b-a384-722ba5ac0720 for instance with vm_state building and task_state spawning. 
[ 594.764833] env[60548]: DEBUG nova.compute.manager [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Received event network-changed-f8f975b2-da5f-457b-a384-722ba5ac0720 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 594.765431] env[60548]: DEBUG nova.compute.manager [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Refreshing instance network info cache due to event network-changed-f8f975b2-da5f-457b-a384-722ba5ac0720. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 594.765431] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Acquiring lock "refresh_cache-b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 594.765431] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Acquired lock "refresh_cache-b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 594.765578] env[60548]: DEBUG nova.network.neutron [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Refreshing network info cache for port f8f975b2-da5f-457b-a384-722ba5ac0720 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 595.212660] env[60548]: DEBUG nova.network.neutron [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Updated VIF entry in instance network info cache for port f8f975b2-da5f-457b-a384-722ba5ac0720. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 595.213106] env[60548]: DEBUG nova.network.neutron [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Updating instance_info_cache with network_info: [{"id": "0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1", "address": "fa:16:3e:2a:3e:aa", "network": {"id": "50239ef5-d3f4-467d-8a7d-5ac0ddb27ad0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1857023818", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0cf84f4c-c2", "ovs_interfaceid": "0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f8f975b2-da5f-457b-a384-722ba5ac0720", "address": "fa:16:3e:b1:b3:01", "network": {"id": "f79008b6-120d-4b0e-9226-d5db9d1a2032", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-596521528", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "849fc06e-dfc2-470f-8490-034590682ea7", "external-id": "nsx-vlan-transportzone-567", "segmentation_id": 567, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf8f975b2-da", "ovs_interfaceid": "f8f975b2-da5f-457b-a384-722ba5ac0720", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81746aea-a712-4714-a0ac-7b0d021469c7", "address": "fa:16:3e:23:ca:3d", "network": {"id": "50239ef5-d3f4-467d-8a7d-5ac0ddb27ad0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1857023818", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.242", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", 
"segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81746aea-a7", "ovs_interfaceid": "81746aea-a712-4714-a0ac-7b0d021469c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 595.225954] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Releasing lock "refresh_cache-b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 595.226131] env[60548]: DEBUG nova.compute.manager [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Received event network-vif-plugged-81746aea-a712-4714-a0ac-7b0d021469c7 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 595.226400] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Acquiring lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 595.226627] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 595.226783] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 595.226943] env[60548]: DEBUG nova.compute.manager [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] No waiting events found dispatching network-vif-plugged-81746aea-a712-4714-a0ac-7b0d021469c7 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 595.227123] env[60548]: WARNING nova.compute.manager [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Received unexpected event network-vif-plugged-81746aea-a712-4714-a0ac-7b0d021469c7 for instance with vm_state building and task_state spawning. 
[ 595.227392] env[60548]: DEBUG nova.compute.manager [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Received event network-changed-81746aea-a712-4714-a0ac-7b0d021469c7 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 595.227469] env[60548]: DEBUG nova.compute.manager [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Refreshing instance network info cache due to event network-changed-81746aea-a712-4714-a0ac-7b0d021469c7. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 595.227591] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Acquiring lock "refresh_cache-b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 595.227721] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Acquired lock "refresh_cache-b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 595.227870] env[60548]: DEBUG nova.network.neutron [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Refreshing network info cache for port 81746aea-a712-4714-a0ac-7b0d021469c7 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 595.733561] env[60548]: DEBUG nova.network.neutron [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Updated VIF entry in instance network info cache for port 81746aea-a712-4714-a0ac-7b0d021469c7. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 595.734286] env[60548]: DEBUG nova.network.neutron [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Updating instance_info_cache with network_info: [{"id": "0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1", "address": "fa:16:3e:2a:3e:aa", "network": {"id": "50239ef5-d3f4-467d-8a7d-5ac0ddb27ad0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1857023818", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.84", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", "segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0cf84f4c-c2", "ovs_interfaceid": "0cf84f4c-c2e3-4473-a7ce-eb6c1765bab1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f8f975b2-da5f-457b-a384-722ba5ac0720", "address": "fa:16:3e:b1:b3:01", "network": {"id": "f79008b6-120d-4b0e-9226-d5db9d1a2032", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-596521528", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.222", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "849fc06e-dfc2-470f-8490-034590682ea7", "external-id": "nsx-vlan-transportzone-567", "segmentation_id": 567, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf8f975b2-da", "ovs_interfaceid": "f8f975b2-da5f-457b-a384-722ba5ac0720", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81746aea-a712-4714-a0ac-7b0d021469c7", "address": "fa:16:3e:23:ca:3d", "network": {"id": "50239ef5-d3f4-467d-8a7d-5ac0ddb27ad0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1857023818", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.242", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8494da0659724fdfa3b4021c74cd6897", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3b67e519-46cf-44ce-b670-4ba4c0c5b658", "external-id": "nsx-vlan-transportzone-110", 
"segmentation_id": 110, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81746aea-a7", "ovs_interfaceid": "81746aea-a712-4714-a0ac-7b0d021469c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 595.746883] env[60548]: DEBUG oslo_concurrency.lockutils [req-08118efb-09d6-4e69-ad5f-47ea6378b435 req-638c87e1-0e12-4ae7-a64e-ace270d68750 service nova] Releasing lock "refresh_cache-b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 611.392987] env[60548]: WARNING oslo_vmware.rw_handles [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 611.392987] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 611.392987] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 611.392987] env[60548]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 611.392987] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 611.392987] env[60548]: ERROR oslo_vmware.rw_handles response.begin() [ 611.392987] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 611.392987] env[60548]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 611.392987] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 611.392987] env[60548]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 611.392987] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 611.392987] env[60548]: ERROR oslo_vmware.rw_handles [ 611.393841] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to vmware_temp/f5a9fe2e-f6fc-4c3c-8902-89ba1d50383c/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 611.394814] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Caching image {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 611.395065] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Copying Virtual Disk [datastore1] vmware_temp/f5a9fe2e-f6fc-4c3c-8902-89ba1d50383c/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk to [datastore1] 
vmware_temp/f5a9fe2e-f6fc-4c3c-8902-89ba1d50383c/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk {{(pid=60548) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 611.395356] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-30f03109-ef09-49f9-8a5f-4473ea2ec4ef {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.405719] env[60548]: DEBUG oslo_vmware.api [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Waiting for the task: (returnval){ [ 611.405719] env[60548]: value = "task-4323299" [ 611.405719] env[60548]: _type = "Task" [ 611.405719] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 611.415107] env[60548]: DEBUG oslo_vmware.api [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Task: {'id': task-4323299, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 611.916429] env[60548]: DEBUG oslo_vmware.exceptions [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Fault InvalidArgument not matched. {{(pid=60548) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 611.916429] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 611.919992] env[60548]: ERROR nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 611.919992] env[60548]: Faults: ['InvalidArgument'] [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Traceback (most recent call last): [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] yield resources [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] self.driver.spawn(context, instance, image_meta, [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] 
self._vmops.spawn(context, instance, image_meta, injected_files, [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] self._fetch_image_if_missing(context, vi) [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] image_cache(vi, tmp_image_ds_loc) [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] vm_util.copy_virtual_disk( [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] session._wait_for_task(vmdk_copy_task) [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] return self.wait_for_task(task_ref) [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] return evt.wait() [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] result = hub.switch() [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] return self.greenlet.switch() [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] self.f(*self.args, **self.kw) [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] raise exceptions.translate_fault(task_info.error) [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 611.919992] env[60548]: ERROR nova.compute.manager 
[instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Faults: ['InvalidArgument'] [ 611.919992] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] [ 611.921017] env[60548]: INFO nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Terminating instance [ 611.922377] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 611.922547] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 611.923362] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Acquiring lock "refresh_cache-d76e8d11-53d3-417d-b6a6-08bdff8165d5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 611.923568] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Acquired lock "refresh_cache-d76e8d11-53d3-417d-b6a6-08bdff8165d5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 611.923690] env[60548]: DEBUG nova.network.neutron [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 611.925120] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-194fafbc-c754-47ae-973b-2675f629667e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.936569] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 611.936784] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 611.938884] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-19a535ec-4035-43fa-a574-a315bddb1a92 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 611.948371] env[60548]: DEBUG oslo_vmware.api [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Waiting for the task: (returnval){ [ 611.948371] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52b46581-455a-c84d-05f4-85104b6d3a52" [ 611.948371] env[60548]: _type = "Task" [ 611.948371] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 611.958384] env[60548]: DEBUG oslo_vmware.api [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52b46581-455a-c84d-05f4-85104b6d3a52, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 612.074619] env[60548]: DEBUG nova.network.neutron [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 612.463021] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 612.463758] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Creating directory with path [datastore1] vmware_temp/2aabf739-246f-403f-9cdf-f912e8635074/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 612.464730] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-343c6b12-fdc2-4aea-985b-13fc4e664827 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.480498] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Created directory with path [datastore1] vmware_temp/2aabf739-246f-403f-9cdf-f912e8635074/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 612.480774] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Fetch image to [datastore1] vmware_temp/2aabf739-246f-403f-9cdf-f912e8635074/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 612.480978] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/2aabf739-246f-403f-9cdf-f912e8635074/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 612.482727] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-095f5501-7ca9-413f-b092-26f66706d2fb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.492667] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f54d6210-cd13-4de2-885a-e72873e222c3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.508337] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0ecc2f4-8dac-4fe9-b4fe-d1c102587655 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.549129] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60ea33b7-9ae6-481a-aee7-861f2f6603f2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.556809] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b3b01b01-b647-4ff5-a4df-218bf256ec9a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.583335] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 612.653368] env[60548]: DEBUG oslo_vmware.rw_handles [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2aabf739-246f-403f-9cdf-f912e8635074/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 612.715785] env[60548]: DEBUG oslo_vmware.rw_handles [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Completed reading data from the image iterator. 
{{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 612.715948] env[60548]: DEBUG oslo_vmware.rw_handles [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2aabf739-246f-403f-9cdf-f912e8635074/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 612.823580] env[60548]: DEBUG nova.network.neutron [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 612.835504] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Releasing lock "refresh_cache-d76e8d11-53d3-417d-b6a6-08bdff8165d5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 612.835504] env[60548]: DEBUG nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 612.835504] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 612.835504] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf27d4ff-102b-4c6f-932d-352ae437af5a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.844470] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 612.844935] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9ece1560-c5a7-4ef5-9130-3c2f68da4d72 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.884137] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 612.884137] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 
tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 612.884137] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Deleting the datastore file [datastore1] d76e8d11-53d3-417d-b6a6-08bdff8165d5 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 612.884137] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-080f6030-c492-4678-a274-bf89f29f1292 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 612.897898] env[60548]: DEBUG oslo_vmware.api [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Waiting for the task: (returnval){ [ 612.897898] env[60548]: value = "task-4323301" [ 612.897898] env[60548]: _type = "Task" [ 612.897898] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 612.908337] env[60548]: DEBUG oslo_vmware.api [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Task: {'id': task-4323301, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 613.406109] env[60548]: DEBUG oslo_vmware.api [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Task: {'id': task-4323301, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.04294} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 613.406452] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 613.406632] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 613.406824] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 613.408187] env[60548]: INFO nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Took 0.57 seconds to destroy the instance on the hypervisor. 
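The destroy sequence above follows the standard oslo.vmware task pattern: a *_Task method (here FileManager.DeleteDatastoreFile_Task) returns a task handle, and wait_for_task polls it until it reaches a terminal state (task-4323301 goes from "progress is 0%" to "completed successfully", duration_secs 0.04294). A minimal sketch of that polling loop, assuming a hypothetical get_task_info callable in place of the PropertyCollector round-trips the real library performs:

```python
import time

class TaskFailed(Exception):
    """Stand-in for the translated VimFaultException oslo.vmware raises on task errors."""

def wait_for_vmware_task(get_task_info, poll_interval=0.5):
    # get_task_info() is assumed to return a dict like
    # {'state': 'running', 'progress': 0, 'result': None, 'error': None};
    # the real library reads vim.TaskInfo through the PropertyCollector instead.
    while True:
        info = get_task_info()
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            # e.g. "A specified parameter was not correct: fileType"
            raise TaskFailed(info['error'])
        time.sleep(poll_interval)  # 'queued' / 'running': poll again
```

The ~0.5 s gap between the "progress is 0%" record and the completion record above is consistent with a fixed polling cadence of roughly that interval.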
[ 613.408542] env[60548]: DEBUG oslo.service.loopingcall [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 613.408765] env[60548]: DEBUG nova.compute.manager [-] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Skipping network deallocation for instance since networking was not requested. {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 613.414120] env[60548]: DEBUG nova.compute.claims [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 613.414315] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 613.414569] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 613.644471] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5c9b310-0d26-4a12-8632-421675c98d41 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.655890] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13052e38-d279-4f44-8ab8-ea2c333831f4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.697563] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68aba690-a5ec-4fcb-826c-ea3554cd57e5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.711213] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0305d516-e19a-4512-92c3-6440a9daa96d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 613.726770] env[60548]: DEBUG nova.compute.provider_tree [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 613.742890] env[60548]: DEBUG nova.scheduler.client.report [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Inventory has not changed for provider
3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 613.764925] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.350s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 613.766996] env[60548]: ERROR nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 613.766996] env[60548]: Faults: ['InvalidArgument'] [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Traceback (most recent call last): [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] self.driver.spawn(context, instance, image_meta, [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] self._fetch_image_if_missing(context, vi) [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] image_cache(vi, tmp_image_ds_loc) [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] vm_util.copy_virtual_disk( [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] session._wait_for_task(vmdk_copy_task) [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: 
d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] return self.wait_for_task(task_ref) [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] return evt.wait() [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] result = hub.switch() [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] return self.greenlet.switch() [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] self.f(*self.args, **self.kw) [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] raise exceptions.translate_fault(task_info.error) [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Faults: ['InvalidArgument'] [ 613.766996] env[60548]: ERROR nova.compute.manager [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] [ 613.771186] env[60548]: DEBUG nova.compute.utils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] VimFaultException {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 613.773672] env[60548]: DEBUG nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Build of instance d76e8d11-53d3-417d-b6a6-08bdff8165d5 was re-scheduled: A specified parameter was not correct: fileType [ 613.773672] env[60548]: Faults: ['InvalidArgument'] {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 613.774738] env[60548]: DEBUG nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} 
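Both tracebacks end in raise exceptions.translate_fault(task_info.error), and the earlier "Fault InvalidArgument not matched" record at 611.916 shows why the error surfaces as a generic VimFaultException: the fault name has no dedicated exception class registered. A hedged sketch of that lookup-with-fallback, with names chosen for illustration rather than copied from oslo.vmware:

```python
class VimFaultException(Exception):
    """Generic fault carrying the fault-name list, e.g. ['InvalidArgument']."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

FAULT_CLASS_REGISTRY = {}  # assumed mapping: fault name -> specific exception class

def translate_fault(fault_name, message):
    cls = FAULT_CLASS_REGISTRY.get(fault_name)
    if cls is None:
        # "Fault InvalidArgument not matched" -> fall back to the generic class,
        # which is why the traceback prints "Faults: ['InvalidArgument']"
        return VimFaultException([fault_name], message)
    return cls(message)

exc = translate_fault('InvalidArgument',
                      'A specified parameter was not correct: fileType')
```

The translated exception propagates up through copy_virtual_disk and spawn, after which the compute manager aborts the resource claim and re-schedules the build, as the 613.7 records above show.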
[ 613.774738] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Acquiring lock "refresh_cache-d76e8d11-53d3-417d-b6a6-08bdff8165d5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 613.774738] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Acquired lock "refresh_cache-d76e8d11-53d3-417d-b6a6-08bdff8165d5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 613.774738] env[60548]: DEBUG nova.network.neutron [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 614.149645] env[60548]: DEBUG nova.network.neutron [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 614.724701] env[60548]: DEBUG nova.network.neutron [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 614.737334] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Releasing lock "refresh_cache-d76e8d11-53d3-417d-b6a6-08bdff8165d5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 614.740023] env[60548]: DEBUG nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 614.740023] env[60548]: DEBUG nova.compute.manager [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] [instance: d76e8d11-53d3-417d-b6a6-08bdff8165d5] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 614.880043] env[60548]: INFO nova.scheduler.client.report [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Deleted allocations for instance d76e8d11-53d3-417d-b6a6-08bdff8165d5 [ 614.910384] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cc008bbb-da29-413c-8d6f-cbe86f0a9d3f tempest-ServerDiagnosticsV248Test-603941409 tempest-ServerDiagnosticsV248Test-603941409-project-member] Lock "d76e8d11-53d3-417d-b6a6-08bdff8165d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 57.389s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.686902] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 626.686902] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 626.707316] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 626.707562] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 626.717806] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.721652] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.721963] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.004s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 626.722174] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60548) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 626.723358] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b823477-9336-4fe9-abbc-ddb17d583ae4 {{(pid=60548) request_handler
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.733979] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad2a79e1-4df8-48e2-88d0-da5439604779 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.755619] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eeac0b27-af7a-4e12-aab5-a0586fc2b16e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.764145] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86ecfffc-d752-429c-8e68-0faa9bed0a37 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.801814] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180688MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60548) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 626.802102] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.802225] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.884974] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b974272a-5c32-4ed2-99db-1b1ac744d08c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 626.884974] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 626.885152] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3a7076f4-fc00-4f82-804b-4dac0de9ab3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 626.885252] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 9a14b9d0-876b-45c6-825e-103caac6bef9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 626.885350] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b2a1b0c6-91cd-437d-b5bf-ca618b008c23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 626.885468] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b14d8d3f-9253-4e5f-a5b8-09a2f43888a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 626.886725] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 386edc81-5f27-4e44-af7a-f5e47ded1327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 626.886725] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 626.886725] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=100GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 626.997506] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb4e9310-3030-4dc5-bc05-38a37095c11c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.007264] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aacac6d8-5713-466b-8635-6580533ea6e8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.038425] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b68b7ee5-1c77-4dae-a128-10afbf330ecf {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.046960] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9d276be-b15f-4f7b-872c-6bccf099c3f2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.065039] env[60548]: DEBUG nova.compute.provider_tree [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 627.078300] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: 
{'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 627.101491] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60548) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 627.101491] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.298s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.565476] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 627.565671] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Starting heal instance info cache {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 627.565828] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Rebuilding the list of instances to heal {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 627.588252] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 627.589443] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 627.589443] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 627.589443] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 627.589443] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Skipping network cache update for instance because it is Building. 
{{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 627.589443] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 627.589630] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 627.589630] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Didn't find any instances for network info cache update. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 627.590192] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 627.590365] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 627.590513] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 627.590651] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 628.171929] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 628.172293] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60548) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 635.981837] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Acquiring lock "2751bdfb-2f28-48e0-98c2-f232ed6da6df" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 635.982594] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Lock "2751bdfb-2f28-48e0-98c2-f232ed6da6df" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 635.997434] env[60548]: DEBUG nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 636.059817] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.060152] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.061778] env[60548]: INFO nova.compute.claims [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 636.273741] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a274c7e6-9abb-439f-90bc-793685c69d66 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.284598] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f79cfde1-8fef-4ff8-9c00-7920f895e3a5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.334050] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9244e49d-c84c-4d52-bd21-d1b4f45b04e6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.346919] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0158942-91de-4db7-82d7-89f33d74a0c6 {{(pid=60548) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.364276] env[60548]: DEBUG nova.compute.provider_tree [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 636.374401] env[60548]: DEBUG nova.scheduler.client.report [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 636.401409] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.340s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 636.401409] env[60548]: DEBUG nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 636.469942] env[60548]: DEBUG nova.compute.utils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 636.471245] env[60548]: DEBUG nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 636.471418] env[60548]: DEBUG nova.network.neutron [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 636.488214] env[60548]: DEBUG nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Start building block device mappings for instance. 
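The "Using /dev/sd instead of None" record above is get_next_device_name falling back to the /dev/sd prefix when the boot request names no device, then taking the first free letter. A simplified reconstruction (single-letter suffixes only; the real helper also normalizes prefixes and handles sdaa-style overflow):

```python
import string

def next_device_name(used, prefix="/dev/sd"):
    """Return the first prefix+letter not already in use (simplified)."""
    for letter in string.ascii_lowercase:
        candidate = prefix + letter
        if candidate not in used:
            return candidate
    raise ValueError(f"no free device name under {prefix}")

print(next_device_name(set()))           # /dev/sda for a fresh instance
print(next_device_name({"/dev/sda"}))    # /dev/sdb once the root disk exists
```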
{{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 636.539937] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Acquiring lock "83ecd8bb-ba2b-4151-986b-26f50b54e8e2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.540452] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Lock "83ecd8bb-ba2b-4151-986b-26f50b54e8e2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.558381] env[60548]: DEBUG nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 636.587321] env[60548]: DEBUG nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 636.618786] env[60548]: DEBUG nova.virt.hardware [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 636.619049] env[60548]: DEBUG nova.virt.hardware [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 636.619206] env[60548]: DEBUG nova.virt.hardware [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 636.619379] env[60548]: DEBUG nova.virt.hardware [None 
req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 636.619516] env[60548]: DEBUG nova.virt.hardware [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 636.619663] env[60548]: DEBUG nova.virt.hardware [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 636.619869] env[60548]: DEBUG nova.virt.hardware [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 636.620199] env[60548]: DEBUG nova.virt.hardware [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 636.620433] env[60548]: DEBUG nova.virt.hardware [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 636.620742] env[60548]: DEBUG nova.virt.hardware [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 636.620806] env[60548]: DEBUG nova.virt.hardware [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 636.621686] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d3e2922-dbb1-4090-9e75-9cdcea48cb94 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.636928] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6df4322b-94e8-4fae-ab74-639c8b6b3a31 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.662777] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.663133] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.665812] env[60548]: INFO nova.compute.claims [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 636.732792] env[60548]: DEBUG nova.policy [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '51b1b7c43f4d421d86136375ede56642', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d4a1eeaa09d4f1e8af6e69bd9559be0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 636.920654] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c230229a-129b-4a22-8800-607970058a31 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.929095] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b6c80bb-ce9d-4bda-900a-2813be32773d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.969111] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0540649-e255-4c33-a741-2c1aa356eb45 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.978982] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1cc5b73-7d5d-4a4b-b943-83118f68ebf4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.999986] env[60548]: DEBUG nova.compute.provider_tree [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 637.012873] env[60548]: DEBUG nova.scheduler.client.report [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 
196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 637.031027] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.367s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 637.032163] env[60548]: DEBUG nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 637.070014] env[60548]: DEBUG nova.compute.utils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 637.073192] env[60548]: DEBUG nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 637.073192] env[60548]: DEBUG nova.network.neutron [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 637.120098] env[60548]: DEBUG nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 637.215995] env[60548]: DEBUG nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Start spawning the instance on the hypervisor. 
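The VirtCPUTopology walk logged above for instance 2751bdfb (and repeated below for 83ecd8bb) reduces the flavor/image hints to per-axis limits, then enumerates every sockets x cores x threads factorization of the vCPU count that fits them; with one vCPU the only candidate is 1:1:1. A rough reconstruction of that enumeration, not Nova's exact algorithm:

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Yield (sockets, cores, threads) triples whose product equals vcpus."""
    # Divisor walk; the real code additionally sorts candidates against the
    # preferred topology before picking one.
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        rest = vcpus // sockets
        for cores in range(1, min(rest, max_cores) + 1):
            if rest % cores:
                continue
            threads = rest // cores
            if threads <= max_threads:
                yield (sockets, cores, threads)

print(list(possible_topologies(1)))  # [(1, 1, 1)], matching "Got 1 possible topologies"
print(list(possible_topologies(4)))  # every sockets*cores*threads == 4 split
```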
{{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 637.247421] env[60548]: DEBUG nova.virt.hardware [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 637.247421] env[60548]: DEBUG nova.virt.hardware [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 637.247592] env[60548]: DEBUG nova.virt.hardware [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 637.247761] env[60548]: DEBUG nova.virt.hardware [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 637.247911] env[60548]: DEBUG nova.virt.hardware [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 637.248332] env[60548]: DEBUG nova.virt.hardware [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 637.248601] env[60548]: DEBUG nova.virt.hardware [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 637.248764] env[60548]: DEBUG nova.virt.hardware [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 637.248932] env[60548]: DEBUG nova.virt.hardware [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 637.250014] env[60548]: DEBUG nova.virt.hardware [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 637.250014] env[60548]: DEBUG nova.virt.hardware [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 637.250373] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d3aecb6-988e-4e29-a6a8-4ced5b763cb6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.264501] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a16d25a-ed2c-43f4-ad63-24db13c1913d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 637.314112] env[60548]: DEBUG nova.policy [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '78216297d3664124b0e2a03b6d970704', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ee8cc26b75e64dbab138f327470b88d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 638.142995] env[60548]: DEBUG nova.network.neutron [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Successfully created port: 87172cab-7cca-4f9e-b9c3-3850418db9e5 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 638.592810] env[60548]: DEBUG nova.network.neutron [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Successfully created port: 2c311e52-2180-42f1-b2a9-c1f5cc33a26c {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 640.209849] env[60548]: DEBUG nova.network.neutron [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Successfully updated port: 87172cab-7cca-4f9e-b9c3-3850418db9e5 {{(pid=60548) _update_port 
/opt/stack/nova/nova/network/neutron.py:586}} [ 640.250213] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Acquiring lock "refresh_cache-2751bdfb-2f28-48e0-98c2-f232ed6da6df" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 640.250599] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Acquired lock "refresh_cache-2751bdfb-2f28-48e0-98c2-f232ed6da6df" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 640.251261] env[60548]: DEBUG nova.network.neutron [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 640.338050] env[60548]: DEBUG nova.network.neutron [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 640.649420] env[60548]: DEBUG nova.network.neutron [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Successfully updated port: 2c311e52-2180-42f1-b2a9-c1f5cc33a26c {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 640.661294] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Acquiring lock "refresh_cache-83ecd8bb-ba2b-4151-986b-26f50b54e8e2" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 640.661441] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Acquired lock "refresh_cache-83ecd8bb-ba2b-4151-986b-26f50b54e8e2" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 640.661636] env[60548]: DEBUG nova.network.neutron [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 640.762538] env[60548]: DEBUG nova.network.neutron [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Instance cache missing network info. 
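Once the port is reported as updated, the records below refill instance_info_cache with the full network_info structure. A small sketch of pulling the useful bits (port id, MAC, fixed IPs) out of one entry, using the key layout the log prints; the payload here is a trimmed copy of the one below:

```python
# Trimmed copy of the instance_info_cache payload logged below.
network_info = [{
    "id": "87172cab-7cca-4f9e-b9c3-3850418db9e5",
    "address": "fa:16:3e:96:91:dd",
    "network": {"subnets": [{"cidr": "192.168.128.0/28",
                             "ips": [{"address": "192.168.128.8",
                                      "type": "fixed"}]}]},
    "ovs_interfaceid": "87172cab-7cca-4f9e-b9c3-3850418db9e5",
}]

for vif in network_info:
    fixed = [ip["address"]
             for subnet in vif["network"]["subnets"]
             for ip in subnet["ips"] if ip["type"] == "fixed"]
    print(vif["id"][:11], vif["address"], fixed)
```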
{{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 640.804246] env[60548]: DEBUG nova.network.neutron [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Updating instance_info_cache with network_info: [{"id": "87172cab-7cca-4f9e-b9c3-3850418db9e5", "address": "fa:16:3e:96:91:dd", "network": {"id": "62c87eca-755d-47eb-83a1-958d8ced3618", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1948454616-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d4a1eeaa09d4f1e8af6e69bd9559be0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap87172cab-7c", "ovs_interfaceid": "87172cab-7cca-4f9e-b9c3-3850418db9e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 640.819243] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Releasing lock "refresh_cache-2751bdfb-2f28-48e0-98c2-f232ed6da6df" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 640.819612] env[60548]: DEBUG nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Instance network_info: |[{"id": "87172cab-7cca-4f9e-b9c3-3850418db9e5", "address": "fa:16:3e:96:91:dd", "network": {"id": "62c87eca-755d-47eb-83a1-958d8ced3618", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1948454616-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d4a1eeaa09d4f1e8af6e69bd9559be0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap87172cab-7c", "ovs_interfaceid": "87172cab-7cca-4f9e-b9c3-3850418db9e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 640.821232] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:96:91:dd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ed3ffc1d-9f86-4029-857e-6cd1d383edbb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '87172cab-7cca-4f9e-b9c3-3850418db9e5', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 640.831051] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Creating folder: Project (1d4a1eeaa09d4f1e8af6e69bd9559be0). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 640.831655] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9aa70cb6-68f8-4b23-bd51-6b69a11b3fd3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 640.846427] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Created folder: Project (1d4a1eeaa09d4f1e8af6e69bd9559be0) in parent group-v850287. [ 640.846427] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Creating folder: Instances. Parent ref: group-v850312. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 640.846427] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1cd74986-68c8-4a8e-a7ea-25ccde383fe9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 640.863698] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Created folder: Instances in parent group-v850312. [ 640.863955] env[60548]: DEBUG oslo.service.loopingcall [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
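Before CreateVM_Task, the driver flattens each network_info entry into the "Instance VIF info" dict logged above: the NSX logical-switch id becomes an OpaqueNetwork network_ref and the Neutron port id rides along as iface_id. A hedged sketch of that mapping; the field names are copied from the log, but the function itself is ours, not the driver's code:

```python
def vif_info_from_network_info(vif, vif_model="vmxnet3"):
    """Illustrative translation of one network_info entry into the VIF
    info shape logged by build_virtual_machine; not the driver's code."""
    return {
        "network_name": vif["network"]["bridge"],            # 'br-int' in the log
        "mac_address": vif["address"],
        "network_ref": {
            "type": "OpaqueNetwork",
            "network-id": vif["details"]["nsx-logical-switch-id"],
            "network-type": "nsx.LogicalSwitch",
            "use-external-id": True,
        },
        "iface_id": vif["id"],
        "vif_model": vif_model,
    }

vif = {"id": "87172cab-7cca-4f9e-b9c3-3850418db9e5",
       "address": "fa:16:3e:96:91:dd",
       "network": {"bridge": "br-int"},
       "details": {"nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb"}}
print(vif_info_from_network_info(vif))
```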
{{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 640.864193] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 640.864410] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-de9e1a77-fa7b-4615-b5a1-d901c905c6fe {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 640.887367] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 640.887367] env[60548]: value = "task-4323304" [ 640.887367] env[60548]: _type = "Task" [ 640.887367] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 640.898097] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323304, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 641.401543] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323304, 'name': CreateVM_Task, 'duration_secs': 0.367455} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 641.401824] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 641.403080] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 641.403080] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 641.403080] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 641.403514] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cf06ba43-bbc2-4b94-9642-e11669c0038f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 641.411476] env[60548]: DEBUG oslo_vmware.api [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Waiting for the task: (returnval){ [ 641.411476] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52ac5afa-1c75-1a3b-b4af-01bfe55d190f" [ 641.411476] env[60548]: _type = "Task" [ 641.411476] env[60548]: } to complete. 
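CreateVM_Task runs asynchronously on the vCenter side, so the client sits in a poll loop: fetch progress, log it, sleep, repeat, until the task succeeds or errors (the records here show 0% and then completion about 0.37s later). A stdlib stand-in for that loop, with a fake task object in place of the real vSphere session:

```python
import time

class FakeTask:
    """Stand-in for a vCenter task handle; the real code asks the session."""
    def __init__(self, ticks_until_done=2):
        self._ticks = ticks_until_done

    def poll(self):
        """Return (state, progress); flips to success after a few polls."""
        self._ticks -= 1
        return ("success", 100) if self._ticks <= 0 else ("running", 0)

def wait_for_task(task, interval=0.2):
    start = time.monotonic()
    while True:
        state, progress = task.poll()
        print(f"Task progress is {progress}%.")   # shape of the _poll_task line
        if state == "success":
            return time.monotonic() - start       # the log's duration_secs
        if state == "error":
            raise RuntimeError("task failed")
        time.sleep(interval)

print(f"CreateVM_Task stand-in completed in {wait_for_task(FakeTask()):.3f}s")
```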
{{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 641.421959] env[60548]: DEBUG oslo_vmware.api [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52ac5afa-1c75-1a3b-b4af-01bfe55d190f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 641.447653] env[60548]: DEBUG nova.network.neutron [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Updating instance_info_cache with network_info: [{"id": "2c311e52-2180-42f1-b2a9-c1f5cc33a26c", "address": "fa:16:3e:1d:14:16", "network": {"id": "bf8bfca4-49a7-4a27-b635-5bf15d24e5dd", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1943277012-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ee8cc26b75e64dbab138f327470b88d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee4b2432-c393-4e50-ae0e-b5e12bad37db", "external-id": "nsx-vlan-transportzone-985", "segmentation_id": 985, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2c311e52-21", "ovs_interfaceid": "2c311e52-2180-42f1-b2a9-c1f5cc33a26c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 641.465620] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Releasing lock "refresh_cache-83ecd8bb-ba2b-4151-986b-26f50b54e8e2" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 641.465963] env[60548]: DEBUG nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Instance network_info: |[{"id": "2c311e52-2180-42f1-b2a9-c1f5cc33a26c", "address": "fa:16:3e:1d:14:16", "network": {"id": "bf8bfca4-49a7-4a27-b635-5bf15d24e5dd", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1943277012-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ee8cc26b75e64dbab138f327470b88d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "ee4b2432-c393-4e50-ae0e-b5e12bad37db", "external-id": "nsx-vlan-transportzone-985", "segmentation_id": 985, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2c311e52-21", "ovs_interfaceid": "2c311e52-2180-42f1-b2a9-c1f5cc33a26c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 641.466351] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1d:14:16', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ee4b2432-c393-4e50-ae0e-b5e12bad37db', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2c311e52-2180-42f1-b2a9-c1f5cc33a26c', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 641.480608] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Creating folder: Project (ee8cc26b75e64dbab138f327470b88d4). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 641.481097] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-24430916-6a6c-4549-b50c-c39e5da2e690 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 641.495894] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Created folder: Project (ee8cc26b75e64dbab138f327470b88d4) in parent group-v850287. [ 641.495894] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Creating folder: Instances. Parent ref: group-v850315. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 641.495894] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cfcbddcf-39e3-4918-931a-6f1863a2f3b0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 641.510062] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Created folder: Instances in parent group-v850315. [ 641.510062] env[60548]: DEBUG oslo.service.loopingcall [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 641.510062] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 641.510062] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3dadce1f-19e4-48cf-a591-74c227b37ddb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 641.536136] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 641.536136] env[60548]: value = "task-4323307" [ 641.536136] env[60548]: _type = "Task" [ 641.536136] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 641.546832] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323307, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 641.923846] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 641.924126] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 641.924466] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 642.047849] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323307, 'name': CreateVM_Task, 'duration_secs': 0.329207} completed successfully. 
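Both builds now serialize on the same cached image: the lock name in the surrounding records is simply the datastore path of the cached VMDK, "[datastore] cache-dir/<image-id>/<image-id>.vmdk". A tiny sketch of that naming convention (the helper is ours; the layout is read straight off the log):

```python
def cached_vmdk_lock_name(datastore, image_id,
                          cache_dir="devstack-image-cache_base"):
    """Compose the '[datastore] dir/<id>/<id>.vmdk' string used as a lock name."""
    return f"[{datastore}] {cache_dir}/{image_id}/{image_id}.vmdk"

# Matches the lock acquired and released in the records around this point.
print(cached_vmdk_lock_name("datastore1", "5674e50f-0c0c-4f19-8379-104dac34660b"))
```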
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 642.048045] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 642.048986] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 642.049264] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 642.049699] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 642.051013] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e734312d-d4a3-4be9-9187-d6557d98a132 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 642.058779] env[60548]: DEBUG oslo_vmware.api [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Waiting for the task: (returnval){ [ 642.058779] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]526ed573-205d-f470-7fee-82a0b923fd32" [ 642.058779] env[60548]: _type = "Task" [ 642.058779] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 642.071823] env[60548]: DEBUG oslo_vmware.api [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]526ed573-205d-f470-7fee-82a0b923fd32, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 642.093827] env[60548]: DEBUG nova.compute.manager [req-d67cf7bf-5949-43a1-9f8c-6d6302a73d55 req-062735f8-6efc-41ba-8f06-c18f347a441d service nova] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Received event network-vif-plugged-87172cab-7cca-4f9e-b9c3-3850418db9e5 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 642.094143] env[60548]: DEBUG oslo_concurrency.lockutils [req-d67cf7bf-5949-43a1-9f8c-6d6302a73d55 req-062735f8-6efc-41ba-8f06-c18f347a441d service nova] Acquiring lock "2751bdfb-2f28-48e0-98c2-f232ed6da6df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 642.094288] env[60548]: DEBUG oslo_concurrency.lockutils [req-d67cf7bf-5949-43a1-9f8c-6d6302a73d55 req-062735f8-6efc-41ba-8f06-c18f347a441d service nova] Lock "2751bdfb-2f28-48e0-98c2-f232ed6da6df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 642.094452] env[60548]: DEBUG oslo_concurrency.lockutils [req-d67cf7bf-5949-43a1-9f8c-6d6302a73d55 req-062735f8-6efc-41ba-8f06-c18f347a441d service nova] Lock "2751bdfb-2f28-48e0-98c2-f232ed6da6df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 642.094705] env[60548]: DEBUG nova.compute.manager [req-d67cf7bf-5949-43a1-9f8c-6d6302a73d55 req-062735f8-6efc-41ba-8f06-c18f347a441d service nova] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] No waiting events found dispatching network-vif-plugged-87172cab-7cca-4f9e-b9c3-3850418db9e5 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 642.094943] env[60548]: WARNING nova.compute.manager [req-d67cf7bf-5949-43a1-9f8c-6d6302a73d55 req-062735f8-6efc-41ba-8f06-c18f347a441d service nova] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Received unexpected event network-vif-plugged-87172cab-7cca-4f9e-b9c3-3850418db9e5 for instance with vm_state building and task_state spawning. 
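The network-vif-plugged records above show the external-event path: under a per-instance "-events" lock, the manager tries to pop a registered waiter for the event, and when nothing is waiting (the instance is still building) it logs the "unexpected event" warning instead of failing. A threading-based sketch of that pop-or-warn shape, not Nova's InstanceEvents class:

```python
import threading

class InstanceEvents:
    """Minimal stand-in: (instance, event) -> waiter, guarded by one lock."""
    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}      # (instance_uuid, event_name) -> threading.Event

    def prepare(self, instance, event):
        """Register a waiter before triggering the external operation."""
        with self._lock:
            waiter = threading.Event()
            self._waiters[(instance, event)] = waiter
            return waiter

    def pop_instance_event(self, instance, event):
        """Pop the waiter, if any, under the '<uuid>-events'-style lock."""
        with self._lock:
            return self._waiters.pop((instance, event), None)

events = InstanceEvents()
name = "network-vif-plugged-87172cab-7cca-4f9e-b9c3-3850418db9e5"
waiter = events.pop_instance_event("2751bdfb-2f28-48e0-98c2-f232ed6da6df", name)
if waiter is None:
    # Nothing was waiting -- the instance is still building, so the event is
    # merely logged as unexpected, exactly as the WARNING above shows.
    print(f"No waiting events found dispatching {name}")
else:
    waiter.set()                # wake whoever called waiter.wait(timeout)
```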
[ 642.569863] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 642.570219] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 642.570755] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 646.828080] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Acquiring lock "afb2cdc1-74ec-4d08-85cb-e96b4071f661" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 646.828437] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Lock "afb2cdc1-74ec-4d08-85cb-e96b4071f661" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 646.847840] env[60548]: DEBUG nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Starting instance... 
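Each build in this log opens the same way: a lock named after the instance UUID serializes the inner build function, and lockutils reports how long the caller waited for it and held it. A stdlib approximation of that acquire/held bookkeeping (a defaultdict of threading.Lock stands in for oslo.concurrency):

```python
import threading
import time
from collections import defaultdict

_locks = defaultdict(threading.Lock)    # one lock per instance UUID

def locked_build(instance_uuid, build_fn):
    """Serialize build work per instance and report waited/held times,
    mimicking the lockutils lines in the surrounding records."""
    lock = _locks[instance_uuid]
    t0 = time.monotonic()
    with lock:
        print(f'Lock "{instance_uuid}" acquired :: '
              f'waited {time.monotonic() - t0:.3f}s')
        t1 = time.monotonic()
        try:
            return build_fn()
        finally:
            print(f'Lock "{instance_uuid}" "released" :: '
                  f'held {time.monotonic() - t1:.3f}s')

locked_build("afb2cdc1-74ec-4d08-85cb-e96b4071f661", lambda: time.sleep(0.01))
```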
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 646.910588] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 646.910878] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 646.912521] env[60548]: INFO nova.compute.claims [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 647.178126] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-717b18bd-3fc2-44c5-acd2-108a16418ea9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.193923] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dab61cd7-5b2e-41ee-9758-7efdf088df70 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.232316] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5e41664-ea01-4714-ae5e-2d43249fa950 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.241993] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a233ff77-61b7-4c6f-82d0-3d8149d99cec {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.257924] env[60548]: DEBUG nova.compute.provider_tree [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 647.268793] env[60548]: DEBUG nova.scheduler.client.report [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 647.284122] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 
tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.373s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.284637] env[60548]: DEBUG nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 647.338239] env[60548]: DEBUG nova.compute.utils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 647.341035] env[60548]: DEBUG nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 647.341035] env[60548]: DEBUG nova.network.neutron [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 647.355855] env[60548]: DEBUG nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 647.450285] env[60548]: DEBUG nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Start spawning the instance on the hypervisor. 
{{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 647.483638] env[60548]: DEBUG nova.virt.hardware [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=<?>,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-07-22T11:06:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 647.483887] env[60548]: DEBUG nova.virt.hardware [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 647.485165] env[60548]: DEBUG nova.virt.hardware [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 647.485420] env[60548]: DEBUG nova.virt.hardware [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 647.485574] env[60548]: DEBUG nova.virt.hardware [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 647.485722] env[60548]: DEBUG nova.virt.hardware [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 647.486041] env[60548]: DEBUG nova.virt.hardware [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 647.486194] env[60548]: DEBUG nova.virt.hardware [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 647.486362] 
env[60548]: DEBUG nova.virt.hardware [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 647.486523] env[60548]: DEBUG nova.virt.hardware [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 647.486693] env[60548]: DEBUG nova.virt.hardware [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 647.487663] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-907da7c1-99b8-4f98-850b-17e895305e3a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.493402] env[60548]: DEBUG nova.compute.manager [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Received event network-changed-87172cab-7cca-4f9e-b9c3-3850418db9e5 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 647.493608] env[60548]: DEBUG nova.compute.manager [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Refreshing instance network info cache due to event network-changed-87172cab-7cca-4f9e-b9c3-3850418db9e5. 
{{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 647.493817] env[60548]: DEBUG oslo_concurrency.lockutils [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] Acquiring lock "refresh_cache-2751bdfb-2f28-48e0-98c2-f232ed6da6df" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 647.493951] env[60548]: DEBUG oslo_concurrency.lockutils [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] Acquired lock "refresh_cache-2751bdfb-2f28-48e0-98c2-f232ed6da6df" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 647.494114] env[60548]: DEBUG nova.network.neutron [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Refreshing network info cache for port 87172cab-7cca-4f9e-b9c3-3850418db9e5 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 647.501508] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42d4cedb-be51-44fc-82fe-9a7b47bcbbdb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 647.765035] env[60548]: DEBUG nova.policy [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9182c7e1ada64547a03c0d3248c361fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b22cd34069e444cfac6558f88d753b56', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 649.009033] env[60548]: DEBUG nova.network.neutron [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Updated VIF entry in instance network info cache for port 87172cab-7cca-4f9e-b9c3-3850418db9e5. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 649.012019] env[60548]: DEBUG nova.network.neutron [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Updating instance_info_cache with network_info: [{"id": "87172cab-7cca-4f9e-b9c3-3850418db9e5", "address": "fa:16:3e:96:91:dd", "network": {"id": "62c87eca-755d-47eb-83a1-958d8ced3618", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1948454616-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1d4a1eeaa09d4f1e8af6e69bd9559be0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed3ffc1d-9f86-4029-857e-6cd1d383edbb", "external-id": "nsx-vlan-transportzone-759", "segmentation_id": 759, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap87172cab-7c", "ovs_interfaceid": "87172cab-7cca-4f9e-b9c3-3850418db9e5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 649.032659] env[60548]: DEBUG oslo_concurrency.lockutils [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] Releasing lock "refresh_cache-2751bdfb-2f28-48e0-98c2-f232ed6da6df" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 649.032920] env[60548]: DEBUG nova.compute.manager [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Received event network-vif-plugged-2c311e52-2180-42f1-b2a9-c1f5cc33a26c {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 649.033230] env[60548]: DEBUG oslo_concurrency.lockutils [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] Acquiring lock "83ecd8bb-ba2b-4151-986b-26f50b54e8e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 649.033315] env[60548]: DEBUG oslo_concurrency.lockutils [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] Lock "83ecd8bb-ba2b-4151-986b-26f50b54e8e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 649.033464] env[60548]: DEBUG oslo_concurrency.lockutils [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] Lock "83ecd8bb-ba2b-4151-986b-26f50b54e8e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 649.033715] env[60548]: DEBUG nova.compute.manager 
[req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] No waiting events found dispatching network-vif-plugged-2c311e52-2180-42f1-b2a9-c1f5cc33a26c {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 649.033776] env[60548]: WARNING nova.compute.manager [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Received unexpected event network-vif-plugged-2c311e52-2180-42f1-b2a9-c1f5cc33a26c for instance with vm_state building and task_state spawning. [ 649.033916] env[60548]: DEBUG nova.compute.manager [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Received event network-changed-2c311e52-2180-42f1-b2a9-c1f5cc33a26c {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 649.034932] env[60548]: DEBUG nova.compute.manager [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Refreshing instance network info cache due to event network-changed-2c311e52-2180-42f1-b2a9-c1f5cc33a26c. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 649.034932] env[60548]: DEBUG oslo_concurrency.lockutils [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] Acquiring lock "refresh_cache-83ecd8bb-ba2b-4151-986b-26f50b54e8e2" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 649.034932] env[60548]: DEBUG oslo_concurrency.lockutils [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] Acquired lock "refresh_cache-83ecd8bb-ba2b-4151-986b-26f50b54e8e2" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 649.034932] env[60548]: DEBUG nova.network.neutron [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Refreshing network info cache for port 2c311e52-2180-42f1-b2a9-c1f5cc33a26c {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 649.467061] env[60548]: DEBUG nova.network.neutron [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Successfully created port: b25225e3-3b7a-4efb-b82f-d2bbd02040ff {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 649.700557] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Acquiring lock "8f10776c-4124-48fb-9135-d674986d4ad3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 649.700557] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Lock "8f10776c-4124-48fb-9135-d674986d4ad3" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 649.713335] env[60548]: DEBUG nova.network.neutron [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Updated VIF entry in instance network info cache for port 2c311e52-2180-42f1-b2a9-c1f5cc33a26c. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 649.713690] env[60548]: DEBUG nova.network.neutron [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Updating instance_info_cache with network_info: [{"id": "2c311e52-2180-42f1-b2a9-c1f5cc33a26c", "address": "fa:16:3e:1d:14:16", "network": {"id": "bf8bfca4-49a7-4a27-b635-5bf15d24e5dd", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1943277012-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ee8cc26b75e64dbab138f327470b88d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ee4b2432-c393-4e50-ae0e-b5e12bad37db", "external-id": "nsx-vlan-transportzone-985", "segmentation_id": 985, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2c311e52-21", "ovs_interfaceid": "2c311e52-2180-42f1-b2a9-c1f5cc33a26c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 649.727494] env[60548]: DEBUG oslo_concurrency.lockutils [req-1fdb8772-c256-4d85-b907-4887f90c5b31 req-fe0b4f3c-8022-452b-bf52-3cad8621d49d service nova] Releasing lock "refresh_cache-83ecd8bb-ba2b-4151-986b-26f50b54e8e2" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 651.490715] env[60548]: DEBUG nova.network.neutron [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Successfully updated port: b25225e3-3b7a-4efb-b82f-d2bbd02040ff {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 651.503944] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Acquiring lock "refresh_cache-afb2cdc1-74ec-4d08-85cb-e96b4071f661" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 651.504283] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Acquired lock "refresh_cache-afb2cdc1-74ec-4d08-85cb-e96b4071f661" {{(pid=60548) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 651.504345] env[60548]: DEBUG nova.network.neutron [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 651.640898] env[60548]: DEBUG nova.network.neutron [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 652.478134] env[60548]: DEBUG nova.network.neutron [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Updating instance_info_cache with network_info: [{"id": "b25225e3-3b7a-4efb-b82f-d2bbd02040ff", "address": "fa:16:3e:ab:f0:ea", "network": {"id": "12caa09b-602a-4ec1-90c9-d72482f4f0e5", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-197335805-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b22cd34069e444cfac6558f88d753b56", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3117b312-701b-4439-b197-96b6c5cdca89", "external-id": "nsx-vlan-transportzone-94", "segmentation_id": 94, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb25225e3-3b", "ovs_interfaceid": "b25225e3-3b7a-4efb-b82f-d2bbd02040ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 652.508994] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Releasing lock "refresh_cache-afb2cdc1-74ec-4d08-85cb-e96b4071f661" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 652.509252] env[60548]: DEBUG nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Instance network_info: |[{"id": "b25225e3-3b7a-4efb-b82f-d2bbd02040ff", "address": "fa:16:3e:ab:f0:ea", "network": {"id": "12caa09b-602a-4ec1-90c9-d72482f4f0e5", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-197335805-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b22cd34069e444cfac6558f88d753b56", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3117b312-701b-4439-b197-96b6c5cdca89", "external-id": "nsx-vlan-transportzone-94", "segmentation_id": 94, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb25225e3-3b", "ovs_interfaceid": "b25225e3-3b7a-4efb-b82f-d2bbd02040ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 652.509626] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ab:f0:ea', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3117b312-701b-4439-b197-96b6c5cdca89', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b25225e3-3b7a-4efb-b82f-d2bbd02040ff', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 652.519089] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Creating folder: Project (b22cd34069e444cfac6558f88d753b56). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 652.519716] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-10e79bff-cc07-465a-a021-a0ed33db1183 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 652.532291] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Created folder: Project (b22cd34069e444cfac6558f88d753b56) in parent group-v850287. [ 652.532531] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Creating folder: Instances. Parent ref: group-v850318. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 652.532781] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8ab46391-0cb1-4377-aa0a-21963fb57302 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 652.546954] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Created folder: Instances in parent group-v850318. [ 652.547301] env[60548]: DEBUG oslo.service.loopingcall [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 652.547553] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 652.547999] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3b1bb152-19f4-4293-841e-aa1aa8b336c9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 652.573684] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 652.573684] env[60548]: value = "task-4323310" [ 652.573684] env[60548]: _type = "Task" [ 652.573684] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 652.590227] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323310, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 653.085475] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323310, 'name': CreateVM_Task} progress is 99%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 653.282228] env[60548]: DEBUG nova.compute.manager [req-e0d29953-b928-4899-aaa5-6424a8fbba75 req-83150e22-fb73-44be-ae2c-809641b3339e service nova] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Received event network-vif-plugged-b25225e3-3b7a-4efb-b82f-d2bbd02040ff {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 653.282228] env[60548]: DEBUG oslo_concurrency.lockutils [req-e0d29953-b928-4899-aaa5-6424a8fbba75 req-83150e22-fb73-44be-ae2c-809641b3339e service nova] Acquiring lock "afb2cdc1-74ec-4d08-85cb-e96b4071f661-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 653.282228] env[60548]: DEBUG oslo_concurrency.lockutils [req-e0d29953-b928-4899-aaa5-6424a8fbba75 req-83150e22-fb73-44be-ae2c-809641b3339e service nova] Lock "afb2cdc1-74ec-4d08-85cb-e96b4071f661-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 653.282228] env[60548]: DEBUG oslo_concurrency.lockutils [req-e0d29953-b928-4899-aaa5-6424a8fbba75 req-83150e22-fb73-44be-ae2c-809641b3339e service nova] Lock "afb2cdc1-74ec-4d08-85cb-e96b4071f661-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 653.282228] env[60548]: DEBUG nova.compute.manager [req-e0d29953-b928-4899-aaa5-6424a8fbba75 req-83150e22-fb73-44be-ae2c-809641b3339e service nova] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] No waiting events found dispatching network-vif-plugged-b25225e3-3b7a-4efb-b82f-d2bbd02040ff {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 653.282228] env[60548]: WARNING nova.compute.manager [req-e0d29953-b928-4899-aaa5-6424a8fbba75 req-83150e22-fb73-44be-ae2c-809641b3339e service nova] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Received unexpected event network-vif-plugged-b25225e3-3b7a-4efb-b82f-d2bbd02040ff for instance with vm_state building and task_state spawning. 
[ 653.588023] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323310, 'name': CreateVM_Task} progress is 99%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 654.086895] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323310, 'name': CreateVM_Task, 'duration_secs': 1.318836} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 654.087670] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 654.088449] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 654.088604] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 654.089227] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 654.089614] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-865bb9ae-c24d-43ac-ab64-fae94a50185d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 654.095832] env[60548]: DEBUG oslo_vmware.api [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Waiting for the task: (returnval){ [ 654.095832] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]529559f6-7802-87ff-d00b-b4e8ec6da586" [ 654.095832] env[60548]: _type = "Task" [ 654.095832] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 654.104779] env[60548]: DEBUG oslo_vmware.api [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]529559f6-7802-87ff-d00b-b4e8ec6da586, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 654.612464] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 654.612464] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 654.612464] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 654.785761] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquiring lock "2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 654.785761] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Lock "2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.474528] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Acquiring lock "be11788c-634f-40c0-8c8c-d6253d0e68ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 655.474785] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Lock "be11788c-634f-40c0-8c8c-d6253d0e68ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.741818] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquiring lock "306f3cb9-3028-4ff2-8090-2c9c1c72efc1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" 
{{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.742135] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Lock "306f3cb9-3028-4ff2-8090-2c9c1c72efc1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 657.260255] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Acquiring lock "46737200-2da8-41ee-b33e-3bb6cc3e4618" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 657.260555] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Lock "46737200-2da8-41ee-b33e-3bb6cc3e4618" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 657.740217] env[60548]: DEBUG nova.compute.manager [req-6ded53c8-188d-445a-b1e2-4fa68723cf5d req-2e1feb0a-8026-4cda-9b79-c01810495060 service nova] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Received event network-changed-b25225e3-3b7a-4efb-b82f-d2bbd02040ff {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 657.740430] env[60548]: DEBUG nova.compute.manager [req-6ded53c8-188d-445a-b1e2-4fa68723cf5d req-2e1feb0a-8026-4cda-9b79-c01810495060 service nova] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Refreshing instance network info cache due to event network-changed-b25225e3-3b7a-4efb-b82f-d2bbd02040ff. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 657.740654] env[60548]: DEBUG oslo_concurrency.lockutils [req-6ded53c8-188d-445a-b1e2-4fa68723cf5d req-2e1feb0a-8026-4cda-9b79-c01810495060 service nova] Acquiring lock "refresh_cache-afb2cdc1-74ec-4d08-85cb-e96b4071f661" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 657.740795] env[60548]: DEBUG oslo_concurrency.lockutils [req-6ded53c8-188d-445a-b1e2-4fa68723cf5d req-2e1feb0a-8026-4cda-9b79-c01810495060 service nova] Acquired lock "refresh_cache-afb2cdc1-74ec-4d08-85cb-e96b4071f661" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 657.740994] env[60548]: DEBUG nova.network.neutron [req-6ded53c8-188d-445a-b1e2-4fa68723cf5d req-2e1feb0a-8026-4cda-9b79-c01810495060 service nova] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Refreshing network info cache for port b25225e3-3b7a-4efb-b82f-d2bbd02040ff {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 658.826684] env[60548]: DEBUG nova.network.neutron [req-6ded53c8-188d-445a-b1e2-4fa68723cf5d req-2e1feb0a-8026-4cda-9b79-c01810495060 service nova] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Updated VIF entry in instance network info cache for port b25225e3-3b7a-4efb-b82f-d2bbd02040ff. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 658.826684] env[60548]: DEBUG nova.network.neutron [req-6ded53c8-188d-445a-b1e2-4fa68723cf5d req-2e1feb0a-8026-4cda-9b79-c01810495060 service nova] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Updating instance_info_cache with network_info: [{"id": "b25225e3-3b7a-4efb-b82f-d2bbd02040ff", "address": "fa:16:3e:ab:f0:ea", "network": {"id": "12caa09b-602a-4ec1-90c9-d72482f4f0e5", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-197335805-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b22cd34069e444cfac6558f88d753b56", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3117b312-701b-4439-b197-96b6c5cdca89", "external-id": "nsx-vlan-transportzone-94", "segmentation_id": 94, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb25225e3-3b", "ovs_interfaceid": "b25225e3-3b7a-4efb-b82f-d2bbd02040ff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 658.837975] env[60548]: DEBUG oslo_concurrency.lockutils [req-6ded53c8-188d-445a-b1e2-4fa68723cf5d req-2e1feb0a-8026-4cda-9b79-c01810495060 service nova] Releasing lock "refresh_cache-afb2cdc1-74ec-4d08-85cb-e96b4071f661" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 661.408904] env[60548]: WARNING oslo_vmware.rw_handles [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 661.408904] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 661.408904] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 661.408904] env[60548]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 661.408904] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 661.408904] env[60548]: ERROR oslo_vmware.rw_handles response.begin() [ 661.408904] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 661.408904] env[60548]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 661.408904] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 661.408904] env[60548]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 661.408904] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 661.408904] env[60548]: ERROR oslo_vmware.rw_handles [ 661.408904] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 
tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to vmware_temp/2aabf739-246f-403f-9cdf-f912e8635074/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 661.411354] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Caching image {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 661.411624] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Copying Virtual Disk [datastore1] vmware_temp/2aabf739-246f-403f-9cdf-f912e8635074/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk to [datastore1] vmware_temp/2aabf739-246f-403f-9cdf-f912e8635074/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk {{(pid=60548) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 661.411980] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6b592da3-1e5a-4135-882c-66b7cf0eb3c1 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 661.422155] env[60548]: DEBUG oslo_vmware.api [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Waiting for the task: (returnval){ [ 661.422155] env[60548]: value = "task-4323315" [ 661.422155] env[60548]: _type = "Task" [ 661.422155] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 661.434977] env[60548]: DEBUG oslo_vmware.api [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Task: {'id': task-4323315, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 661.933226] env[60548]: DEBUG oslo_vmware.exceptions [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Fault InvalidArgument not matched. 
{{(pid=60548) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 661.933525] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 661.934345] env[60548]: ERROR nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 661.934345] env[60548]: Faults: ['InvalidArgument'] [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Traceback (most recent call last): [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] yield resources [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] self.driver.spawn(context, instance, image_meta, [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] self._fetch_image_if_missing(context, vi) [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] image_cache(vi, tmp_image_ds_loc) [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] vm_util.copy_virtual_disk( [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] session._wait_for_task(vmdk_copy_task) [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 
661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] return self.wait_for_task(task_ref) [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] return evt.wait() [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] result = hub.switch() [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] return self.greenlet.switch() [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] self.f(*self.args, **self.kw) [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] raise exceptions.translate_fault(task_info.error) [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Faults: ['InvalidArgument'] [ 661.934345] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] [ 661.935275] env[60548]: INFO nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Terminating instance [ 661.938210] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 661.938210] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 661.938695] env[60548]: DEBUG nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Start destroying the instance on the hypervisor. 
{{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 661.938734] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 661.939245] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e3d3e41c-547a-48e8-afc1-bb6728f3669b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 661.943442] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74ca8965-53c5-4db1-91ad-b2cd4c620578 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 661.953894] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 661.954218] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e7441cb2-b8f1-46c6-822c-0e62f062b5d8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 661.958228] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 661.958438] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 661.959293] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6ca3bda9-bd6b-4aa8-8ef5-9249fca5b26f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 661.967360] env[60548]: DEBUG oslo_vmware.api [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Waiting for the task: (returnval){ [ 661.967360] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]527d96a4-077d-f463-5024-108ea33e56bf" [ 661.967360] env[60548]: _type = "Task" [ 661.967360] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 661.979367] env[60548]: DEBUG oslo_vmware.api [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]527d96a4-077d-f463-5024-108ea33e56bf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 662.028861] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 662.029150] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 662.029335] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Deleting the datastore file [datastore1] 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 662.029622] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6d3a4560-8c14-4a40-9a53-8b91e14e90c5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 662.041937] env[60548]: DEBUG oslo_vmware.api [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Waiting for the task: (returnval){ [ 662.041937] env[60548]: value = "task-4323318" [ 662.041937] env[60548]: _type = "Task" [ 662.041937] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 662.057080] env[60548]: DEBUG oslo_vmware.api [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Task: {'id': task-4323318, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 662.482739] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 662.484020] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Creating directory with path [datastore1] vmware_temp/8925b799-0070-4170-8c27-194321187f97/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 662.484020] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f093eefa-a32d-4f54-a035-f4df93af71f9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 662.528208] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Created directory with path [datastore1] vmware_temp/8925b799-0070-4170-8c27-194321187f97/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 662.528208] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Fetch image to [datastore1] vmware_temp/8925b799-0070-4170-8c27-194321187f97/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 662.528208] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/8925b799-0070-4170-8c27-194321187f97/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 662.528208] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba758a82-f5ef-46d5-b18e-4534e6256d5f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 662.543356] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c8182a7-3967-4602-b6f2-8ba34e66993a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 662.565972] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1167de65-f439-4546-bc43-fe805528917c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 662.570259] env[60548]: DEBUG oslo_vmware.api [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Task: 
{'id': task-4323318, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.196944} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 662.570541] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 662.570711] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 662.570871] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 662.571282] env[60548]: INFO nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Took 0.63 seconds to destroy the instance on the hypervisor. [ 662.573417] env[60548]: DEBUG nova.compute.claims [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 662.573592] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 662.573832] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 662.602436] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fcf42e7-689b-45b8-b950-4fbc4c4bc487 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 662.610687] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-72b0b1d8-eca9-4d00-a51c-03bf3271504c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 662.639242] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data 
store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 662.714052] env[60548]: DEBUG oslo_vmware.rw_handles [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8925b799-0070-4170-8c27-194321187f97/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 662.777239] env[60548]: DEBUG oslo_vmware.rw_handles [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Completed reading data from the image iterator. {{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 662.777396] env[60548]: DEBUG oslo_vmware.rw_handles [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8925b799-0070-4170-8c27-194321187f97/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 663.014636] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c4e1c45-f773-41d9-baa5-8226332fdfca {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 663.024842] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23816c45-76e9-47ed-b9ed-92839a31e972 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 663.061986] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a412189c-d448-43bc-8ddd-94f0c31c8762 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 663.070395] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-677d5a52-7166-44ca-b4bb-4987a7f5e898 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 663.079327] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Acquiring lock "67cbaf5c-e743-4e07-8f74-c51e4f57914d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 663.079580] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Lock "67cbaf5c-e743-4e07-8f74-c51e4f57914d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 663.092800] env[60548]: DEBUG 
nova.compute.provider_tree [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 663.103040] env[60548]: DEBUG nova.scheduler.client.report [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 663.123622] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.550s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 663.124166] env[60548]: ERROR nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 663.124166] env[60548]: Faults: ['InvalidArgument'] [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Traceback (most recent call last): [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] self.driver.spawn(context, instance, image_meta, [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] self._fetch_image_if_missing(context, vi) [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] image_cache(vi, tmp_image_ds_loc) [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] vm_util.copy_virtual_disk( [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] session._wait_for_task(vmdk_copy_task) [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] return self.wait_for_task(task_ref) [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] return evt.wait() [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] result = hub.switch() [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] return self.greenlet.switch() [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] self.f(*self.args, **self.kw) [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] raise exceptions.translate_fault(task_info.error) [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Faults: ['InvalidArgument'] [ 663.124166] env[60548]: ERROR nova.compute.manager [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] [ 663.124959] env[60548]: DEBUG nova.compute.utils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] VimFaultException {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 663.127052] env[60548]: DEBUG nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Build of instance 
31e48a76-ffbc-4bd2-a01f-2a69df2de5f5 was re-scheduled: A specified parameter was not correct: fileType [ 663.127052] env[60548]: Faults: ['InvalidArgument'] {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 663.127884] env[60548]: DEBUG nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 663.127884] env[60548]: DEBUG nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 663.127884] env[60548]: DEBUG nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 663.127884] env[60548]: DEBUG nova.network.neutron [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 663.955924] env[60548]: DEBUG nova.network.neutron [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 663.972657] env[60548]: INFO nova.compute.manager [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] [instance: 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5] Took 0.84 seconds to deallocate network for instance. [ 664.147133] env[60548]: INFO nova.scheduler.client.report [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Deleted allocations for instance 31e48a76-ffbc-4bd2-a01f-2a69df2de5f5 [ 664.170730] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba8c1356-2ffa-46df-bfce-69b3b7c417f2 tempest-ServerDiagnosticsTest-533831989 tempest-ServerDiagnosticsTest-533831989-project-member] Lock "31e48a76-ffbc-4bd2-a01f-2a69df2de5f5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 106.769s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 664.199247] env[60548]: DEBUG nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Starting instance... 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 664.285336] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.285614] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 664.287625] env[60548]: INFO nova.compute.claims [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 664.626990] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0931cf2e-73a5-4a5d-8e55-159653f45f73 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.638867] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90282034-4af4-4f41-96dd-57933939c742 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.677863] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f749f162-61b2-47d6-b73b-0a840cb0c47d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.686130] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d16b441c-2138-4225-9c64-8e7fc7f7ae2f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.702563] env[60548]: DEBUG nova.compute.provider_tree [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 664.713349] env[60548]: DEBUG nova.scheduler.client.report [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 664.741890] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d 
tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.456s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 664.742427] env[60548]: DEBUG nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 664.791905] env[60548]: DEBUG nova.compute.utils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 664.793417] env[60548]: DEBUG nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 664.793598] env[60548]: DEBUG nova.network.neutron [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 664.820546] env[60548]: DEBUG nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 664.919755] env[60548]: DEBUG nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Start spawning the instance on the hypervisor. 
{{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 664.951591] env[60548]: DEBUG nova.virt.hardware [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 664.952092] env[60548]: DEBUG nova.virt.hardware [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 664.952092] env[60548]: DEBUG nova.virt.hardware [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 664.954458] env[60548]: DEBUG nova.virt.hardware [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 664.954458] env[60548]: DEBUG nova.virt.hardware [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 664.954458] env[60548]: DEBUG nova.virt.hardware [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 664.954458] env[60548]: DEBUG nova.virt.hardware [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 664.954458] env[60548]: DEBUG nova.virt.hardware [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
664.955573] env[60548]: DEBUG nova.virt.hardware [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 664.955573] env[60548]: DEBUG nova.virt.hardware [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 664.955573] env[60548]: DEBUG nova.virt.hardware [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 664.956999] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2359cee-7cd4-4b14-8125-fab00e10030d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 664.966668] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-308c5f50-64a3-489c-91be-9528475eb425 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.027476] env[60548]: DEBUG nova.policy [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4cf3f3ab34694fa2b92d94ac4942c8fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '990db39db03d40fcae252ac1848180a1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 665.487274] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Acquiring lock "6774e2f5-99d0-4dc9-9ac0-188b35bd68a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.487517] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Lock "6774e2f5-99d0-4dc9-9ac0-188b35bd68a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.788630] env[60548]: DEBUG nova.network.neutron [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Successfully created port: 
09b2c51d-776b-468f-be13-76160943120b {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 665.942176] env[60548]: DEBUG oslo_concurrency.lockutils [None req-adf07f4a-3bca-41f2-9a5f-debef055977c tempest-ServersNegativeTestMultiTenantJSON-819000951 tempest-ServersNegativeTestMultiTenantJSON-819000951-project-member] Acquiring lock "a878722d-7e36-4f15-8c5f-bd473375dd9b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.942176] env[60548]: DEBUG oslo_concurrency.lockutils [None req-adf07f4a-3bca-41f2-9a5f-debef055977c tempest-ServersNegativeTestMultiTenantJSON-819000951 tempest-ServersNegativeTestMultiTenantJSON-819000951-project-member] Lock "a878722d-7e36-4f15-8c5f-bd473375dd9b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 666.731918] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e5837970-1bd9-42cd-9028-3a8d80878924 tempest-ServerMetadataNegativeTestJSON-2122666783 tempest-ServerMetadataNegativeTestJSON-2122666783-project-member] Acquiring lock "deffd52b-d708-4c46-a168-18e80b05b133" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 666.732230] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e5837970-1bd9-42cd-9028-3a8d80878924 tempest-ServerMetadataNegativeTestJSON-2122666783 tempest-ServerMetadataNegativeTestJSON-2122666783-project-member] Lock "deffd52b-d708-4c46-a168-18e80b05b133" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.380862] env[60548]: DEBUG nova.network.neutron [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Successfully updated port: 09b2c51d-776b-468f-be13-76160943120b {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 667.403357] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Acquiring lock "refresh_cache-8f10776c-4124-48fb-9135-d674986d4ad3" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 667.403357] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Acquired lock "refresh_cache-8f10776c-4124-48fb-9135-d674986d4ad3" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 667.403357] env[60548]: DEBUG nova.network.neutron [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Building network info cache for instance {{(pid=60548) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2010}} [ 667.468433] env[60548]: DEBUG nova.network.neutron [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 667.824101] env[60548]: DEBUG nova.network.neutron [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Updating instance_info_cache with network_info: [{"id": "09b2c51d-776b-468f-be13-76160943120b", "address": "fa:16:3e:5f:19:83", "network": {"id": "28f8a522-1e83-443e-8ec7-85853f3fd595", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1842271253-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "990db39db03d40fcae252ac1848180a1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a8c8175-1197-4f12-baac-ef6aba95f585", "external-id": "nsx-vlan-transportzone-832", "segmentation_id": 832, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09b2c51d-77", "ovs_interfaceid": "09b2c51d-776b-468f-be13-76160943120b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 667.849359] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Releasing lock "refresh_cache-8f10776c-4124-48fb-9135-d674986d4ad3" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 667.849691] env[60548]: DEBUG nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Instance network_info: |[{"id": "09b2c51d-776b-468f-be13-76160943120b", "address": "fa:16:3e:5f:19:83", "network": {"id": "28f8a522-1e83-443e-8ec7-85853f3fd595", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1842271253-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "990db39db03d40fcae252ac1848180a1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a8c8175-1197-4f12-baac-ef6aba95f585", "external-id": "nsx-vlan-transportzone-832", 
"segmentation_id": 832, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09b2c51d-77", "ovs_interfaceid": "09b2c51d-776b-468f-be13-76160943120b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 667.850073] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5f:19:83', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1a8c8175-1197-4f12-baac-ef6aba95f585', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '09b2c51d-776b-468f-be13-76160943120b', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 667.859940] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Creating folder: Project (990db39db03d40fcae252ac1848180a1). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 667.860900] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c6301f42-0da9-4532-bd43-6a1d52b72f5d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.873304] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Created folder: Project (990db39db03d40fcae252ac1848180a1) in parent group-v850287. [ 667.873304] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Creating folder: Instances. Parent ref: group-v850324. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 667.873304] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-cf2b3111-2abf-48fe-b09e-29e4e2decf5f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.883528] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Created folder: Instances in parent group-v850324. [ 667.883528] env[60548]: DEBUG oslo.service.loopingcall [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 667.885889] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 667.885889] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-727b19df-063b-48d6-9672-72b32f732a31 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 667.912380] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 667.912380] env[60548]: value = "task-4323323" [ 667.912380] env[60548]: _type = "Task" [ 667.912380] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 667.919721] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323323, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 668.421985] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323323, 'name': CreateVM_Task, 'duration_secs': 0.359748} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 668.423116] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 668.425807] env[60548]: DEBUG oslo_vmware.service [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d8aba4d-c0a9-47fc-a3ec-fb104816359b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.432748] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 668.432748] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Acquired lock "[datastore2] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 668.432982] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 668.433276] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a8a32500-347e-48c2-b033-586dfefdf844 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.443094] env[60548]: DEBUG oslo_vmware.api [None req-67513f80-bc5a-4a37-99dc-57f59becc98d 
tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Waiting for the task: (returnval){ [ 668.443094] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]527ea2bd-967c-95b6-4536-299e2587c8ab" [ 668.443094] env[60548]: _type = "Task" [ 668.443094] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 668.454029] env[60548]: DEBUG oslo_vmware.api [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]527ea2bd-967c-95b6-4536-299e2587c8ab, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 668.556722] env[60548]: DEBUG nova.compute.manager [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Received event network-vif-plugged-09b2c51d-776b-468f-be13-76160943120b {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 668.560023] env[60548]: DEBUG oslo_concurrency.lockutils [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] Acquiring lock "8f10776c-4124-48fb-9135-d674986d4ad3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 668.560023] env[60548]: DEBUG oslo_concurrency.lockutils [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] Lock "8f10776c-4124-48fb-9135-d674986d4ad3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 668.560023] env[60548]: DEBUG oslo_concurrency.lockutils [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] Lock "8f10776c-4124-48fb-9135-d674986d4ad3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.560023] env[60548]: DEBUG nova.compute.manager [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] No waiting events found dispatching network-vif-plugged-09b2c51d-776b-468f-be13-76160943120b {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 668.560023] env[60548]: WARNING nova.compute.manager [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Received unexpected event network-vif-plugged-09b2c51d-776b-468f-be13-76160943120b for instance with vm_state building and task_state spawning. 
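The spawn failure recorded above funnels through one call path: nova.virt.vmwareapi.vm_util.copy_virtual_disk issues CopyVirtualDisk_Task, and oslo.vmware's wait_for_task/_poll_task loop re-raises the exception translated from task_info.error, which is what prints as "A specified parameter was not correct: fileType" with Faults: ['InvalidArgument']. Below is a minimal sketch of that path, assuming a placeholder vCenter endpoint, credentials, datastore paths, and datacenter reference (none of these values come from this log); it is not Nova's own code.

from oslo_vmware import api
from oslo_vmware import exceptions as vexc


def copy_virtual_disk(session, dc_ref, source_path, dest_path):
    # Same vSphere API call that nova.virt.vmwareapi.vm_util.copy_virtual_disk wraps.
    vim = session.vim
    task = session.invoke_api(
        vim, 'CopyVirtualDisk_Task',
        vim.service_content.virtualDiskManager,
        sourceName=source_path, sourceDatacenter=dc_ref,
        destName=dest_path, destDatacenter=dc_ref)
    try:
        # wait_for_task drives the _poll_task loop seen in the entries above and
        # raises the fault translated from task_info.error if the task fails.
        session.wait_for_task(task)
    except vexc.VimFaultException as exc:
        # exc.fault_list is what the log renders as Faults: ['InvalidArgument'].
        print('CopyVirtualDisk_Task failed:', exc.fault_list, str(exc))
        raise


# Usage sketch: the session logs in on construction (placeholder endpoint and
# credentials). dc_ref must be a vim Datacenter managed-object reference, e.g.
# obtained via oslo_vmware.vim_util.get_objects; that lookup is elided here.
session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                               api_retry_count=10, task_poll_interval=0.5)
# copy_virtual_disk(session, dc_ref,
#                   '[datastore1] vmware_temp/example/tmp-sparse.vmdk',
#                   '[datastore1] devstack-image-cache_base/example/example.vmdk')
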
[ 668.560023] env[60548]: DEBUG nova.compute.manager [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Received event network-changed-09b2c51d-776b-468f-be13-76160943120b {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 668.560023] env[60548]: DEBUG nova.compute.manager [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Refreshing instance network info cache due to event network-changed-09b2c51d-776b-468f-be13-76160943120b. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 668.560023] env[60548]: DEBUG oslo_concurrency.lockutils [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] Acquiring lock "refresh_cache-8f10776c-4124-48fb-9135-d674986d4ad3" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 668.560023] env[60548]: DEBUG oslo_concurrency.lockutils [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] Acquired lock "refresh_cache-8f10776c-4124-48fb-9135-d674986d4ad3" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 668.560023] env[60548]: DEBUG nova.network.neutron [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Refreshing network info cache for port 09b2c51d-776b-468f-be13-76160943120b {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 668.953764] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Releasing lock "[datastore2] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 668.953764] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 668.953764] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 668.953764] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Acquired lock "[datastore2] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 668.953764] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-67513f80-bc5a-4a37-99dc-57f59becc98d 
tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 668.954190] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d23f0668-f8d9-43c0-9373-748bbd2e0ad3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.963375] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 668.963591] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 668.964880] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f77cd2f-a1b1-485b-808d-176d5d54f211 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.975310] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0a32621b-5bd8-4e90-9abd-a59509777374 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.986369] env[60548]: DEBUG oslo_vmware.api [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Waiting for the task: (returnval){ [ 668.986369] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]528ce787-00aa-f810-ae76-1d6dfd4cb67d" [ 668.986369] env[60548]: _type = "Task" [ 668.986369] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 668.995116] env[60548]: DEBUG oslo_vmware.api [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]528ce787-00aa-f810-ae76-1d6dfd4cb67d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 669.110282] env[60548]: DEBUG nova.network.neutron [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Updated VIF entry in instance network info cache for port 09b2c51d-776b-468f-be13-76160943120b. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 669.110780] env[60548]: DEBUG nova.network.neutron [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Updating instance_info_cache with network_info: [{"id": "09b2c51d-776b-468f-be13-76160943120b", "address": "fa:16:3e:5f:19:83", "network": {"id": "28f8a522-1e83-443e-8ec7-85853f3fd595", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1842271253-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "990db39db03d40fcae252ac1848180a1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a8c8175-1197-4f12-baac-ef6aba95f585", "external-id": "nsx-vlan-transportzone-832", "segmentation_id": 832, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap09b2c51d-77", "ovs_interfaceid": "09b2c51d-776b-468f-be13-76160943120b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 669.123777] env[60548]: DEBUG oslo_concurrency.lockutils [req-c1ab696e-41ab-4a17-b902-abaeba4057a4 req-be41b977-87c7-4966-b828-6944ee97b777 service nova] Releasing lock "refresh_cache-8f10776c-4124-48fb-9135-d674986d4ad3" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 669.507677] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 669.508015] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Creating directory with path [datastore2] vmware_temp/c2356668-dd2b-406b-8371-5513bab1f2d9/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 669.508286] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f37e3eee-f72f-4046-ae65-d47f608f44d0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.534033] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Created directory with path [datastore2] vmware_temp/c2356668-dd2b-406b-8371-5513bab1f2d9/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 669.534033] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 
tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Fetch image to [datastore2] vmware_temp/c2356668-dd2b-406b-8371-5513bab1f2d9/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 669.534293] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore2] vmware_temp/c2356668-dd2b-406b-8371-5513bab1f2d9/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore2 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 669.535016] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1121e376-1d41-43c8-b12d-17ae5d572793 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.545483] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42eb0cc3-0ca8-4a7b-a951-643a041da8fa {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.558394] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-221b572e-3f16-4187-99c3-5b4d908000a6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.594919] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-957d5ec6-c07b-4ec1-8e39-2ee14053f5f7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.603860] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ba4cd0b2-00a9-41a6-98cb-f2ee66b4953f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.626797] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore2 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 669.699585] env[60548]: DEBUG oslo_vmware.rw_handles [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c2356668-dd2b-406b-8371-5513bab1f2d9/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 669.765789] env[60548]: DEBUG oslo_vmware.rw_handles [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Completed reading data from the image iterator. 
{{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 669.765882] env[60548]: DEBUG oslo_vmware.rw_handles [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c2356668-dd2b-406b-8371-5513bab1f2d9/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 670.034917] env[60548]: DEBUG oslo_concurrency.lockutils [None req-34948935-a815-4d21-8ed3-ddfac054bc11 tempest-ServerTagsTestJSON-1582888721 tempest-ServerTagsTestJSON-1582888721-project-member] Acquiring lock "bf1694f2-6ad0-4e15-b05d-c73c24e0e955" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 670.035234] env[60548]: DEBUG oslo_concurrency.lockutils [None req-34948935-a815-4d21-8ed3-ddfac054bc11 tempest-ServerTagsTestJSON-1582888721 tempest-ServerTagsTestJSON-1582888721-project-member] Lock "bf1694f2-6ad0-4e15-b05d-c73c24e0e955" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 670.382775] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7c9a00f6-9343-4899-83da-79ed712f6304 tempest-ServerActionsTestOtherB-1079721433 tempest-ServerActionsTestOtherB-1079721433-project-member] Acquiring lock "ecc4262d-6133-4541-aec8-fbee05180701" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 670.383235] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7c9a00f6-9343-4899-83da-79ed712f6304 tempest-ServerActionsTestOtherB-1079721433 tempest-ServerActionsTestOtherB-1079721433-project-member] Lock "ecc4262d-6133-4541-aec8-fbee05180701" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 678.345128] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb5badd8-d0c4-4998-b0c5-30e887803cef tempest-ServerActionsV293TestJSON-390462293 tempest-ServerActionsV293TestJSON-390462293-project-member] Acquiring lock "c1b6b578-1bed-4e6c-8e7a-34c2e469cd80" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 678.345128] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb5badd8-d0c4-4998-b0c5-30e887803cef tempest-ServerActionsV293TestJSON-390462293 tempest-ServerActionsV293TestJSON-390462293-project-member] Lock "c1b6b578-1bed-4e6c-8e7a-34c2e469cd80" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 680.539992] env[60548]: DEBUG oslo_concurrency.lockutils [None req-87481eb0-9c3c-4944-8961-f008b897cc07 tempest-ServersTestJSON-844148859 
tempest-ServersTestJSON-844148859-project-member] Acquiring lock "fd3e6440-74fc-4425-9b7e-571245ddc379" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 680.540762] env[60548]: DEBUG oslo_concurrency.lockutils [None req-87481eb0-9c3c-4944-8961-f008b897cc07 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Lock "fd3e6440-74fc-4425-9b7e-571245ddc379" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.420824] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dba59053-e9d1-4c49-bf41-31c2f306dde2 tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquiring lock "fbf4b0bc-d87a-4063-87ef-e44f72ecbf5c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.421485] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dba59053-e9d1-4c49-bf41-31c2f306dde2 tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Lock "fbf4b0bc-d87a-4063-87ef-e44f72ecbf5c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.459621] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dba59053-e9d1-4c49-bf41-31c2f306dde2 tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquiring lock "81c71aa0-9c68-407a-9bef-708c2cb70b12" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.459990] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dba59053-e9d1-4c49-bf41-31c2f306dde2 tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Lock "81c71aa0-9c68-407a-9bef-708c2cb70b12" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 687.166618] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 687.171291] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 687.171495] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 687.171666] env[60548]: DEBUG oslo_service.periodic_task [None 
req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 688.171551] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 688.171809] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Starting heal instance info cache {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 688.171919] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Rebuilding the list of instances to heal {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 688.202106] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 688.202357] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 688.202511] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 688.202650] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 688.202773] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 688.202896] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 688.203025] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 688.203143] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Skipping network cache update for instance because it is Building. 
{{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 688.203330] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 688.203498] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 688.203623] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Didn't find any instances for network info cache update. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 688.204222] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 688.204479] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 688.204593] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60548) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 688.204743] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 688.225117] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.225117] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.225397] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 688.225397] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60548) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 688.226424] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ba80cda6-c7c6-4fe9-9d42-f8b6f73ce2ca {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 688.240109] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20e719a8-58ea-40c9-bef6-55a5aabdd3ee {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 688.255494] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ba96136-1ef8-4d33-9861-978d074fe809 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 688.263538] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78960992-23ba-412a-abfc-be29790522dc {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 688.296244] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180650MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60548) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 688.296480] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.296831] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 688.386224] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b974272a-5c32-4ed2-99db-1b1ac744d08c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.386224] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3a7076f4-fc00-4f82-804b-4dac0de9ab3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.386224] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 9a14b9d0-876b-45c6-825e-103caac6bef9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.386224] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b2a1b0c6-91cd-437d-b5bf-ca618b008c23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.386224] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b14d8d3f-9253-4e5f-a5b8-09a2f43888a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.386224] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 386edc81-5f27-4e44-af7a-f5e47ded1327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.386530] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 2751bdfb-2f28-48e0-98c2-f232ed6da6df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.386530] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 83ecd8bb-ba2b-4151-986b-26f50b54e8e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.386530] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance afb2cdc1-74ec-4d08-85cb-e96b4071f661 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.386750] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 8f10776c-4124-48fb-9135-d674986d4ad3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 688.422227] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.449230] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance be11788c-634f-40c0-8c8c-d6253d0e68ad has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.463441] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 306f3cb9-3028-4ff2-8090-2c9c1c72efc1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.482809] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 46737200-2da8-41ee-b33e-3bb6cc3e4618 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.496270] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 67cbaf5c-e743-4e07-8f74-c51e4f57914d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.512067] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.527569] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance a878722d-7e36-4f15-8c5f-bd473375dd9b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.542296] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance deffd52b-d708-4c46-a168-18e80b05b133 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.557274] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance bf1694f2-6ad0-4e15-b05d-c73c24e0e955 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.570582] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance ecc4262d-6133-4541-aec8-fbee05180701 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.581890] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance c1b6b578-1bed-4e6c-8e7a-34c2e469cd80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.593099] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance fd3e6440-74fc-4425-9b7e-571245ddc379 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.604975] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance fbf4b0bc-d87a-4063-87ef-e44f72ecbf5c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.615512] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 81c71aa0-9c68-407a-9bef-708c2cb70b12 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 688.615816] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 688.615976] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=100GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 689.001657] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1fe3c03-4331-4959-8f1d-d02d019b13ac {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.010642] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84020a5a-2871-407d-b525-460db6861a85 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.041936] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87d7dba3-4e36-4117-ad61-cbfc28dd7265 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.051168] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62df550d-8a73-4640-b24d-11e7db626af9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.066534] env[60548]: DEBUG nova.compute.provider_tree [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 689.075570] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 689.095610] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60548) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 689.095610] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.798s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.061578] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running 
periodic task ComputeManager._poll_volume_usage {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 690.494736] env[60548]: DEBUG oslo_concurrency.lockutils [None req-46fa9596-66bc-4024-b931-31353f9cf956 tempest-ServerRescueNegativeTestJSON-12688595 tempest-ServerRescueNegativeTestJSON-12688595-project-member] Acquiring lock "bdae41ee-e9c3-4272-9623-ca88464ec45a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 690.494975] env[60548]: DEBUG oslo_concurrency.lockutils [None req-46fa9596-66bc-4024-b931-31353f9cf956 tempest-ServerRescueNegativeTestJSON-12688595 tempest-ServerRescueNegativeTestJSON-12688595-project-member] Lock "bdae41ee-e9c3-4272-9623-ca88464ec45a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 691.390655] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cf73e27a-dcd0-427f-b470-9f587fb5df7d tempest-ServerRescueNegativeTestJSON-12688595 tempest-ServerRescueNegativeTestJSON-12688595-project-member] Acquiring lock "3a0515f1-e61a-48d4-980d-49c7189dca2d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 691.390950] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cf73e27a-dcd0-427f-b470-9f587fb5df7d tempest-ServerRescueNegativeTestJSON-12688595 tempest-ServerRescueNegativeTestJSON-12688595-project-member] Lock "3a0515f1-e61a-48d4-980d-49c7189dca2d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 711.427498] env[60548]: WARNING oslo_vmware.rw_handles [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 711.427498] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 711.427498] env[60548]: ERROR oslo_vmware.rw_handles   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 711.427498] env[60548]: ERROR oslo_vmware.rw_handles     self._conn.getresponse()
[ 711.427498] env[60548]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 711.427498] env[60548]: ERROR oslo_vmware.rw_handles     response.begin()
[ 711.427498] env[60548]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 711.427498] env[60548]: ERROR oslo_vmware.rw_handles     version, status, reason = self._read_status()
[ 711.427498] env[60548]: ERROR oslo_vmware.rw_handles   File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 711.427498] env[60548]: ERROR oslo_vmware.rw_handles     raise RemoteDisconnected("Remote end closed connection without"
[ 711.427498] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 711.427498] env[60548]: ERROR oslo_vmware.rw_handles
[ 711.428158]
env[60548]: DEBUG nova.virt.vmwareapi.images [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to vmware_temp/8925b799-0070-4170-8c27-194321187f97/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 711.429651] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Caching image {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 711.430192] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Copying Virtual Disk [datastore1] vmware_temp/8925b799-0070-4170-8c27-194321187f97/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk to [datastore1] vmware_temp/8925b799-0070-4170-8c27-194321187f97/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk {{(pid=60548) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 711.430536] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d95c57bd-539b-42d1-8844-a49a7f64c3d0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 711.440036] env[60548]: DEBUG oslo_vmware.api [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Waiting for the task: (returnval){
[ 711.440036] env[60548]: value = "task-4323328"
[ 711.440036] env[60548]: _type = "Task"
[ 711.440036] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 711.449427] env[60548]: DEBUG oslo_vmware.api [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Task: {'id': task-4323328, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 711.951068] env[60548]: DEBUG oslo_vmware.exceptions [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Fault InvalidArgument not matched.
{{(pid=60548) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 711.951381] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 711.951932] env[60548]: ERROR nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 711.951932] env[60548]: Faults: ['InvalidArgument']
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Traceback (most recent call last):
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     yield resources
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     self.driver.spawn(context, instance, image_meta,
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     self._fetch_image_if_missing(context, vi)
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     image_cache(vi, tmp_image_ds_loc)
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     vm_util.copy_virtual_disk(
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     session._wait_for_task(vmdk_copy_task)
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     return self.wait_for_task(task_ref)
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     return evt.wait()
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     result = hub.switch()
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     return self.greenlet.switch()
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     self.f(*self.args, **self.kw)
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]   File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]     raise exceptions.translate_fault(task_info.error)
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Faults: ['InvalidArgument']
[ 711.951932] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c]
[ 711.952832] env[60548]: INFO nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Terminating instance
[ 711.953806] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 711.954019] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 711.954278] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6b35d48a-606c-4bec-9ed5-222fb7ec7f63 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 711.956763] env[60548]: DEBUG
nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 711.956950] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 711.957737] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e5710ca-4d95-4dce-89a9-b0b258bbf2d7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 711.964680] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 711.964909] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-20faa2a5-78e1-45a1-99e5-56b582a9ae57 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 711.967319] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 711.967560] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 711.968474] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3836c886-55ca-4aee-a8b3-0190207b6e0a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 711.973902] env[60548]: DEBUG oslo_vmware.api [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Waiting for the task: (returnval){
[ 711.973902] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5251a46e-184f-2ecf-adfe-6fd911cb6b84"
[ 711.973902] env[60548]: _type = "Task"
[ 711.973902] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 711.981765] env[60548]: DEBUG oslo_vmware.api [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5251a46e-184f-2ecf-adfe-6fd911cb6b84, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 712.032562] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 712.033088] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 712.033088] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Deleting the datastore file [datastore1] b974272a-5c32-4ed2-99db-1b1ac744d08c {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 712.033268] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c743ada9-50c0-45bc-b474-629ecde0c81a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 712.040585] env[60548]: DEBUG oslo_vmware.api [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Waiting for the task: (returnval){
[ 712.040585] env[60548]: value = "task-4323330"
[ 712.040585] env[60548]: _type = "Task"
[ 712.040585] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 712.049074] env[60548]: DEBUG oslo_vmware.api [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Task: {'id': task-4323330, 'name': DeleteDatastoreFile_Task} progress is 0%.
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 712.486691] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 712.486691] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Creating directory with path [datastore1] vmware_temp/30d80ae1-bdff-46bc-a12f-c0ddf084f4c2/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 712.487053] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-643cc9de-2b3b-4e3a-91e0-8ca5e958f99b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.500185] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Created directory with path [datastore1] vmware_temp/30d80ae1-bdff-46bc-a12f-c0ddf084f4c2/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 712.500384] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Fetch image to [datastore1] vmware_temp/30d80ae1-bdff-46bc-a12f-c0ddf084f4c2/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 712.500545] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/30d80ae1-bdff-46bc-a12f-c0ddf084f4c2/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 712.501315] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bf28a4d-ae27-4db1-8641-638a51bb7351 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.508461] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8bf4b89-b3f3-4080-a80c-901eeb10946d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.517750] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9f955c8-0b60-4995-93f7-3bfa9aba62f5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.551638] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6af66d8a-812d-4ff2-914a-6d369b2234a4 {{(pid=60548) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.560767] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4cb20f14-6306-42ca-8fc1-e9174163bd93 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.562596] env[60548]: DEBUG oslo_vmware.api [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Task: {'id': task-4323330, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082005} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 712.562827] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 712.563059] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 712.563289] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 712.563472] env[60548]: INFO nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 712.566181] env[60548]: DEBUG nova.compute.claims [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 712.566369] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 712.566618] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 712.590280] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 712.638998] env[60548]: DEBUG oslo_vmware.rw_handles [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/30d80ae1-bdff-46bc-a12f-c0ddf084f4c2/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 712.698225] env[60548]: DEBUG oslo_vmware.rw_handles [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Completed reading data from the image iterator. {{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 712.698471] env[60548]: DEBUG oslo_vmware.rw_handles [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/30d80ae1-bdff-46bc-a12f-c0ddf084f4c2/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 712.972139] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a7ffb24-820c-4328-969e-47b1fd75511f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.980899] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1a31f8f-9b8d-46a0-b5c7-b64f49329d62 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.014316] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a240854-7035-4cc7-a273-a9c9a1eb2cfb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.022593] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c53eb6b-3ad9-4ad9-8574-e2c6d635e11d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 713.036804] env[60548]: DEBUG nova.compute.provider_tree [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 713.045361] env[60548]: DEBUG nova.scheduler.client.report [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 713.061172] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.494s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.061723] env[60548]: ERROR nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 713.061723] env[60548]: Faults: ['InvalidArgument'] [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Traceback (most recent call last): [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: 
b974272a-5c32-4ed2-99db-1b1ac744d08c] self.driver.spawn(context, instance, image_meta, [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] self._fetch_image_if_missing(context, vi) [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] image_cache(vi, tmp_image_ds_loc) [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] vm_util.copy_virtual_disk( [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] session._wait_for_task(vmdk_copy_task) [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] return self.wait_for_task(task_ref) [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] return evt.wait() [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] result = hub.switch() [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] return self.greenlet.switch() [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] self.f(*self.args, **self.kw) [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 713.061723] env[60548]: ERROR 
nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] raise exceptions.translate_fault(task_info.error) [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Faults: ['InvalidArgument'] [ 713.061723] env[60548]: ERROR nova.compute.manager [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] [ 713.062552] env[60548]: DEBUG nova.compute.utils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] VimFaultException {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 713.064015] env[60548]: DEBUG nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Build of instance b974272a-5c32-4ed2-99db-1b1ac744d08c was re-scheduled: A specified parameter was not correct: fileType [ 713.064015] env[60548]: Faults: ['InvalidArgument'] {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 713.064399] env[60548]: DEBUG nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 713.064568] env[60548]: DEBUG nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 713.064721] env[60548]: DEBUG nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 713.064879] env[60548]: DEBUG nova.network.neutron [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 713.573555] env[60548]: DEBUG nova.network.neutron [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.583245] env[60548]: INFO nova.compute.manager [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] [instance: b974272a-5c32-4ed2-99db-1b1ac744d08c] Took 0.52 seconds to deallocate network for instance. [ 713.680156] env[60548]: INFO nova.scheduler.client.report [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Deleted allocations for instance b974272a-5c32-4ed2-99db-1b1ac744d08c [ 713.697363] env[60548]: DEBUG oslo_concurrency.lockutils [None req-252c52dd-2b0e-4ee7-8e94-1b981f114895 tempest-ServerExternalEventsTest-1040538488 tempest-ServerExternalEventsTest-1040538488-project-member] Lock "b974272a-5c32-4ed2-99db-1b1ac744d08c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 158.556s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 713.715498] env[60548]: DEBUG nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Starting instance...
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 713.765972] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 713.766297] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 713.767899] env[60548]: INFO nova.compute.claims [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 714.121936] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d55dd97-8e2a-443e-8664-fdb858a67931 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.130593] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-148205f5-f8e6-4385-b5e2-c4f998e92c70 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.160499] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d8c5a26-d669-4dcd-8b52-5e315513d8f4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.168469] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e707b5ab-05ff-4015-a9b8-0c9641002df3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.185044] env[60548]: DEBUG nova.compute.provider_tree [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 714.194139] env[60548]: DEBUG nova.scheduler.client.report [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 714.207396] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 
tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.441s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 714.207985] env[60548]: DEBUG nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 714.239855] env[60548]: DEBUG nova.compute.utils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 714.241642] env[60548]: DEBUG nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 714.241906] env[60548]: DEBUG nova.network.neutron [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 714.250317] env[60548]: DEBUG nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 714.316205] env[60548]: DEBUG nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Start spawning the instance on the hypervisor. 
{{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 714.338803] env[60548]: DEBUG nova.virt.hardware [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 714.339206] env[60548]: DEBUG nova.virt.hardware [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 714.339429] env[60548]: DEBUG nova.virt.hardware [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 714.339685] env[60548]: DEBUG nova.virt.hardware [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 714.339901] env[60548]: DEBUG nova.virt.hardware [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 714.340098] env[60548]: DEBUG nova.virt.hardware [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 714.340349] env[60548]: DEBUG nova.virt.hardware [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 714.340542] env[60548]: DEBUG nova.virt.hardware [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 714.340744] 
env[60548]: DEBUG nova.virt.hardware [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 714.340938] env[60548]: DEBUG nova.virt.hardware [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 714.341154] env[60548]: DEBUG nova.virt.hardware [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 714.342036] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01461204-9045-4ff8-91f6-388681bbcdfd {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.351214] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e104069d-460e-427c-9fd0-3992bd6cb43c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 714.550137] env[60548]: DEBUG nova.policy [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a55618f8417d4926a3441e2620494eb0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cce08d1ae47c4fc1b1f01412cac916bd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 715.350520] env[60548]: DEBUG nova.network.neutron [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Successfully created port: 7696794d-4c5d-406f-b4e5-b0e3e34a7019 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 716.227441] env[60548]: DEBUG nova.compute.manager [req-399cc846-5237-44a0-8058-827f7f41e6a4 req-a328f7a0-bb7c-43e4-b0da-078c526df5d5 service nova] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Received event network-vif-plugged-7696794d-4c5d-406f-b4e5-b0e3e34a7019 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 716.227697] env[60548]: DEBUG oslo_concurrency.lockutils [req-399cc846-5237-44a0-8058-827f7f41e6a4 req-a328f7a0-bb7c-43e4-b0da-078c526df5d5 service nova] Acquiring lock "2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.227947] env[60548]: DEBUG oslo_concurrency.lockutils [req-399cc846-5237-44a0-8058-827f7f41e6a4
req-a328f7a0-bb7c-43e4-b0da-078c526df5d5 service nova] Lock "2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.228715] env[60548]: DEBUG oslo_concurrency.lockutils [req-399cc846-5237-44a0-8058-827f7f41e6a4 req-a328f7a0-bb7c-43e4-b0da-078c526df5d5 service nova] Lock "2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 716.229199] env[60548]: DEBUG nova.compute.manager [req-399cc846-5237-44a0-8058-827f7f41e6a4 req-a328f7a0-bb7c-43e4-b0da-078c526df5d5 service nova] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] No waiting events found dispatching network-vif-plugged-7696794d-4c5d-406f-b4e5-b0e3e34a7019 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 716.229199] env[60548]: WARNING nova.compute.manager [req-399cc846-5237-44a0-8058-827f7f41e6a4 req-a328f7a0-bb7c-43e4-b0da-078c526df5d5 service nova] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Received unexpected event network-vif-plugged-7696794d-4c5d-406f-b4e5-b0e3e34a7019 for instance with vm_state building and task_state spawning. [ 716.401908] env[60548]: DEBUG nova.network.neutron [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Successfully updated port: 7696794d-4c5d-406f-b4e5-b0e3e34a7019 {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 716.412588] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquiring lock "refresh_cache-2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 716.412588] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquired lock "refresh_cache-2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 716.412588] env[60548]: DEBUG nova.network.neutron [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 716.496038] env[60548]: DEBUG nova.network.neutron [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Instance cache missing network info.
{{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 716.784204] env[60548]: DEBUG nova.network.neutron [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Updating instance_info_cache with network_info: [{"id": "7696794d-4c5d-406f-b4e5-b0e3e34a7019", "address": "fa:16:3e:93:c6:11", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7696794d-4c", "ovs_interfaceid": "7696794d-4c5d-406f-b4e5-b0e3e34a7019", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.799791] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Releasing lock "refresh_cache-2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 716.800136] env[60548]: DEBUG nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Instance network_info: |[{"id": "7696794d-4c5d-406f-b4e5-b0e3e34a7019", "address": "fa:16:3e:93:c6:11", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7696794d-4c", "ovs_interfaceid": "7696794d-4c5d-406f-b4e5-b0e3e34a7019", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 716.800515] env[60548]: DEBUG 
nova.virt.vmwareapi.vmops [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:93:c6:11', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '15922696-dc08-44ef-97be-0b09a9dfeae8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7696794d-4c5d-406f-b4e5-b0e3e34a7019', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 716.808621] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Creating folder: Project (cce08d1ae47c4fc1b1f01412cac916bd). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 716.809424] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b40c769a-a6c8-4729-9482-7b05ba0773e3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.823437] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Created folder: Project (cce08d1ae47c4fc1b1f01412cac916bd) in parent group-v850287. [ 716.823656] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Creating folder: Instances. Parent ref: group-v850328. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 716.824360] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3befc4ec-8329-4495-849f-b3452e7262d9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.837030] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Created folder: Instances in parent group-v850328. [ 716.837937] env[60548]: DEBUG oslo.service.loopingcall [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 716.838210] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 716.838439] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ead51398-d710-4a98-923c-f7da18db4799 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 716.861965] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 716.861965] env[60548]: value = "task-4323333" [ 716.861965] env[60548]: _type = "Task" [ 716.861965] env[60548]: } to complete. 
{{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 716.871528] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323333, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 716.990885] env[60548]: WARNING oslo_vmware.rw_handles [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 716.990885] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 716.990885] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 716.990885] env[60548]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 716.990885] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 716.990885] env[60548]: ERROR oslo_vmware.rw_handles response.begin() [ 716.990885] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 716.990885] env[60548]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 716.990885] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 716.990885] env[60548]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 716.990885] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 716.990885] env[60548]: ERROR oslo_vmware.rw_handles [ 716.991277] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to vmware_temp/c2356668-dd2b-406b-8371-5513bab1f2d9/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore2 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 716.992786] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Caching image {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 716.993027] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Copying Virtual Disk [datastore2] vmware_temp/c2356668-dd2b-406b-8371-5513bab1f2d9/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk to [datastore2] vmware_temp/c2356668-dd2b-406b-8371-5513bab1f2d9/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk {{(pid=60548) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 716.993338] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-16a88989-7fb5-451a-bc48-dbbb12d4ef5b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
717.002514] env[60548]: DEBUG oslo_vmware.api [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Waiting for the task: (returnval){ [ 717.002514] env[60548]: value = "task-4323334" [ 717.002514] env[60548]: _type = "Task" [ 717.002514] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 717.017784] env[60548]: DEBUG oslo_vmware.api [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Task: {'id': task-4323334, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 717.373169] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323333, 'name': CreateVM_Task, 'duration_secs': 0.332128} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 717.373423] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 717.374373] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 717.374373] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 717.374725] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 717.375075] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-63244a5c-246b-4d7b-a4e9-5328c966cff5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 717.380878] env[60548]: DEBUG oslo_vmware.api [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Waiting for the task: (returnval){ [ 717.380878] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52baa285-f0d1-39c6-d9dc-bc25fd9fe01a" [ 717.380878] env[60548]: _type = "Task" [ 717.380878] env[60548]: } to complete. 
{{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 717.389994] env[60548]: DEBUG oslo_vmware.api [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52baa285-f0d1-39c6-d9dc-bc25fd9fe01a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 717.513839] env[60548]: DEBUG oslo_vmware.exceptions [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Fault InvalidArgument not matched. {{(pid=60548) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 717.514237] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Releasing lock "[datastore2] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 717.514774] env[60548]: ERROR nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 717.514774] env[60548]: Faults: ['InvalidArgument']
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Traceback (most recent call last):
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] yield resources
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] self.driver.spawn(context, instance, image_meta,
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] self._fetch_image_if_missing(context, vi)
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] image_cache(vi, tmp_image_ds_loc)
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] vm_util.copy_virtual_disk(
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] session._wait_for_task(vmdk_copy_task)
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] return self.wait_for_task(task_ref)
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] return evt.wait()
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] result = hub.switch()
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] return self.greenlet.switch()
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] self.f(*self.args, **self.kw)
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] raise exceptions.translate_fault(task_info.error)
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Faults: ['InvalidArgument']
[ 717.514774] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3]
[ 717.515590] env[60548]: INFO nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Terminating instance
[ 717.517758] env[60548]: DEBUG nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 717.518022] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 717.519050] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5ba1e59-6ad1-4151-855c-56e998aa8e7d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 717.527464] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 717.527776] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ebb26aa6-7867-4ef9-a0d7-4772be5f88d9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 717.599991] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 717.602028] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Deleting contents of the VM from datastore datastore2 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 717.602028] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Deleting the datastore file [datastore2] 8f10776c-4124-48fb-9135-d674986d4ad3 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 717.602028] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-716942d5-9c87-4da0-95e1-5f19f26f6fcf {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 717.609530] env[60548]: DEBUG oslo_vmware.api [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Waiting for the task: (returnval){
[ 717.609530] env[60548]: value = "task-4323336"
[ 717.609530] env[60548]: _type = "Task"
[ 717.609530] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 717.619437] env[60548]: DEBUG oslo_vmware.api [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Task: {'id': task-4323336, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 717.893035] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 717.893349] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 717.893600] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 718.119872] env[60548]: DEBUG oslo_vmware.api [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Task: {'id': task-4323336, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070423} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 718.120150] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 718.120368] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Deleted contents of the VM from datastore datastore2 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 718.121030] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 718.121030] env[60548]: INFO nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 718.124245] env[60548]: DEBUG nova.compute.claims [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 718.124414] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 718.124622] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 718.357344] env[60548]: DEBUG nova.compute.manager [req-25da6f4d-9f7d-4b76-9ead-93dd0fee0ac5 req-5e6aba26-1ba0-4eaf-8fdc-9169e13cf2e3 service nova] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Received event network-changed-7696794d-4c5d-406f-b4e5-b0e3e34a7019 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 718.357686] env[60548]: DEBUG nova.compute.manager [req-25da6f4d-9f7d-4b76-9ead-93dd0fee0ac5 req-5e6aba26-1ba0-4eaf-8fdc-9169e13cf2e3 service nova] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Refreshing instance network info cache due to event network-changed-7696794d-4c5d-406f-b4e5-b0e3e34a7019. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 718.358032] env[60548]: DEBUG oslo_concurrency.lockutils [req-25da6f4d-9f7d-4b76-9ead-93dd0fee0ac5 req-5e6aba26-1ba0-4eaf-8fdc-9169e13cf2e3 service nova] Acquiring lock "refresh_cache-2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 718.358347] env[60548]: DEBUG oslo_concurrency.lockutils [req-25da6f4d-9f7d-4b76-9ead-93dd0fee0ac5 req-5e6aba26-1ba0-4eaf-8fdc-9169e13cf2e3 service nova] Acquired lock "refresh_cache-2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 718.358557] env[60548]: DEBUG nova.network.neutron [req-25da6f4d-9f7d-4b76-9ead-93dd0fee0ac5 req-5e6aba26-1ba0-4eaf-8fdc-9169e13cf2e3 service nova] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Refreshing network info cache for port 7696794d-4c5d-406f-b4e5-b0e3e34a7019 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 718.519960] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e73788fb-509a-4ab9-8862-7438d8e206c7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 718.531271] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b50cc77-1eae-45d1-bc08-2310f7b27f35 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 718.561735] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d921f92-f414-48eb-99ef-d66e323c5f1c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 718.570029] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e723679-a74f-484d-8f23-3a486328236e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 718.583861] env[60548]: DEBUG nova.compute.provider_tree [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 718.593186] env[60548]: DEBUG nova.scheduler.client.report [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 718.612510] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.488s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.488s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 718.613085] env[60548]: ERROR nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 718.613085] env[60548]: Faults: ['InvalidArgument'] [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Traceback (most recent call last): [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] self.driver.spawn(context, instance, image_meta, [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] self._fetch_image_if_missing(context, vi) [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] image_cache(vi, tmp_image_ds_loc) [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] vm_util.copy_virtual_disk( [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] session._wait_for_task(vmdk_copy_task) [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] return self.wait_for_task(task_ref) [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] return evt.wait() [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 
718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] result = hub.switch() [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] return self.greenlet.switch() [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] self.f(*self.args, **self.kw) [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] raise exceptions.translate_fault(task_info.error) [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Faults: ['InvalidArgument'] [ 718.613085] env[60548]: ERROR nova.compute.manager [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] [ 718.613999] env[60548]: DEBUG nova.compute.utils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] VimFaultException {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 718.615311] env[60548]: DEBUG nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Build of instance 8f10776c-4124-48fb-9135-d674986d4ad3 was re-scheduled: A specified parameter was not correct: fileType [ 718.615311] env[60548]: Faults: ['InvalidArgument'] {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 718.615712] env[60548]: DEBUG nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 718.615892] env[60548]: DEBUG nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 718.616074] env[60548]: DEBUG nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 718.616239] env[60548]: DEBUG nova.network.neutron [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 718.803507] env[60548]: DEBUG nova.network.neutron [req-25da6f4d-9f7d-4b76-9ead-93dd0fee0ac5 req-5e6aba26-1ba0-4eaf-8fdc-9169e13cf2e3 service nova] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Updated VIF entry in instance network info cache for port 7696794d-4c5d-406f-b4e5-b0e3e34a7019. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 718.803931] env[60548]: DEBUG nova.network.neutron [req-25da6f4d-9f7d-4b76-9ead-93dd0fee0ac5 req-5e6aba26-1ba0-4eaf-8fdc-9169e13cf2e3 service nova] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Updating instance_info_cache with network_info: [{"id": "7696794d-4c5d-406f-b4e5-b0e3e34a7019", "address": "fa:16:3e:93:c6:11", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.160", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7696794d-4c", "ovs_interfaceid": "7696794d-4c5d-406f-b4e5-b0e3e34a7019", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 718.816981] env[60548]: DEBUG oslo_concurrency.lockutils [req-25da6f4d-9f7d-4b76-9ead-93dd0fee0ac5 req-5e6aba26-1ba0-4eaf-8fdc-9169e13cf2e3 service nova] Releasing lock "refresh_cache-2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 719.067032] env[60548]: DEBUG nova.network.neutron [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] [instance: 8f10776c-4124-48fb-9135-d674986d4ad3] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 719.076695] env[60548]: INFO nova.compute.manager [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 
[ 719.179830] env[60548]: INFO nova.scheduler.client.report [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Deleted allocations for instance 8f10776c-4124-48fb-9135-d674986d4ad3
[ 719.200572] env[60548]: DEBUG oslo_concurrency.lockutils [None req-67513f80-bc5a-4a37-99dc-57f59becc98d tempest-AttachVolumeShelveTestJSON-723093209 tempest-AttachVolumeShelveTestJSON-723093209-project-member] Lock "8f10776c-4124-48fb-9135-d674986d4ad3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 69.500s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 719.216080] env[60548]: DEBUG nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 719.269555] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 719.269799] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 719.271262] env[60548]: INFO nova.compute.claims [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 719.616537] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4283b7cb-3b59-4a4c-9c94-8a4e11cfec18 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 719.624822] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ea5e7b7-a7a2-4376-bf28-0fe0392e0d7c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 719.654690] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62deabb3-7f56-47ed-b07d-8b41c8120adc {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 719.662614] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9a80db9-b968-4dc1-a255-1afdf2fbe4e6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 719.678190] env[60548]: DEBUG nova.compute.provider_tree [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 719.686789] env[60548]: DEBUG nova.scheduler.client.report [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 719.706459] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.437s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 719.706975] env[60548]: DEBUG nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 719.740776] env[60548]: DEBUG nova.compute.utils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 719.743409] env[60548]: DEBUG nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 719.743627] env[60548]: DEBUG nova.network.neutron [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 719.754848] env[60548]: DEBUG nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 719.823147] env[60548]: DEBUG nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 719.838226] env[60548]: DEBUG nova.policy [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7a25f2f313d042e7a5669e7646f2a6bd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '14b6fd8340554fe6b476f6958396a650', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}}
[ 719.842156] env[60548]: DEBUG nova.virt.hardware [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 719.842411] env[60548]: DEBUG nova.virt.hardware [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 719.842615] env[60548]: DEBUG nova.virt.hardware [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 719.842736] env[60548]: DEBUG nova.virt.hardware [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 719.842880] env[60548]: DEBUG nova.virt.hardware [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 719.843050] env[60548]: DEBUG nova.virt.hardware [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 719.843280] env[60548]: DEBUG nova.virt.hardware [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 719.843441] env[60548]: DEBUG nova.virt.hardware [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 719.843618] env[60548]: DEBUG nova.virt.hardware [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 719.843794] env[60548]: DEBUG nova.virt.hardware [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 719.843969] env[60548]: DEBUG nova.virt.hardware [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 719.845291] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7168887a-e671-4a6e-813d-ce2be2d94b9a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 719.853948] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06643b78-483e-4c92-9931-b2a331430b7f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 720.329051] env[60548]: DEBUG nova.network.neutron [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Successfully created port: a2c5ed22-d705-4425-b72f-36792e06f5e7 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 721.547791] env[60548]: DEBUG nova.network.neutron [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Successfully updated port: a2c5ed22-d705-4425-b72f-36792e06f5e7 {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 721.557771] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Acquiring lock "refresh_cache-be11788c-634f-40c0-8c8c-d6253d0e68ad" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 721.557928] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Acquired lock "refresh_cache-be11788c-634f-40c0-8c8c-d6253d0e68ad" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 721.558541] env[60548]: DEBUG nova.network.neutron [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 721.625476] env[60548]: DEBUG nova.network.neutron [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 721.947337] env[60548]: DEBUG nova.network.neutron [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Updating instance_info_cache with network_info: [{"id": "a2c5ed22-d705-4425-b72f-36792e06f5e7", "address": "fa:16:3e:71:ca:ee", "network": {"id": "5afc36cf-8d60-4853-908d-19cd329fc59c", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-118094748-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "14b6fd8340554fe6b476f6958396a650", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7654928b-7afe-42e3-a18d-68ecc775cefe", "external-id": "cl2-zone-807", "segmentation_id": 807, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2c5ed22-d7", "ovs_interfaceid": "a2c5ed22-d705-4425-b72f-36792e06f5e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 721.965452] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Releasing lock "refresh_cache-be11788c-634f-40c0-8c8c-d6253d0e68ad" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 721.965574] env[60548]: DEBUG nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Instance network_info: |[{"id": "a2c5ed22-d705-4425-b72f-36792e06f5e7", "address": "fa:16:3e:71:ca:ee", "network": {"id": "5afc36cf-8d60-4853-908d-19cd329fc59c", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-118094748-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "14b6fd8340554fe6b476f6958396a650", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7654928b-7afe-42e3-a18d-68ecc775cefe", "external-id": "cl2-zone-807", "segmentation_id": 807, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2c5ed22-d7", "ovs_interfaceid": "a2c5ed22-d705-4425-b72f-36792e06f5e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 721.965956] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:71:ca:ee', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7654928b-7afe-42e3-a18d-68ecc775cefe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a2c5ed22-d705-4425-b72f-36792e06f5e7', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 721.973219] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Creating folder: Project (14b6fd8340554fe6b476f6958396a650). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 721.973817] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f3f18479-caee-4431-ba63-a88b138ef878 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 721.986053] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Created folder: Project (14b6fd8340554fe6b476f6958396a650) in parent group-v850287.
[ 721.986269] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Creating folder: Instances. Parent ref: group-v850331. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 721.988753] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e18b740c-7844-4efb-9df8-eaab0639ec77 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 721.999882] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Created folder: Instances in parent group-v850331.
[ 722.000299] env[60548]: DEBUG oslo.service.loopingcall [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 722.000371] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 722.000570] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8b6275eb-4611-4295-8db8-39dd9d92e141 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 722.025263] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 722.025263] env[60548]: value = "task-4323339"
[ 722.025263] env[60548]: _type = "Task"
[ 722.025263] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 722.034905] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323339, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 722.536587] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323339, 'name': CreateVM_Task, 'duration_secs': 0.320025} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 722.536806] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 722.537555] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 722.537711] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 722.538131] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 722.538295] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-82f40fef-abe2-4cc1-bb7a-ee4fc73433cd {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 722.543246] env[60548]: DEBUG oslo_vmware.api [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Waiting for the task: (returnval){
[ 722.543246] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]520f1607-781b-2277-0a1e-c48b673f996d"
[ 722.543246] env[60548]: _type = "Task"
[ 722.543246] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 722.553030] env[60548]: DEBUG oslo_vmware.api [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]520f1607-781b-2277-0a1e-c48b673f996d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 723.056851] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 723.057161] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 723.057386] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 723.079498] env[60548]: DEBUG nova.compute.manager [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Received event network-vif-plugged-a2c5ed22-d705-4425-b72f-36792e06f5e7 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 723.079618] env[60548]: DEBUG oslo_concurrency.lockutils [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] Acquiring lock "be11788c-634f-40c0-8c8c-d6253d0e68ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 723.081120] env[60548]: DEBUG oslo_concurrency.lockutils [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] Lock "be11788c-634f-40c0-8c8c-d6253d0e68ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 723.081120] env[60548]: DEBUG oslo_concurrency.lockutils [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] Lock "be11788c-634f-40c0-8c8c-d6253d0e68ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 723.081120] env[60548]: DEBUG nova.compute.manager [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] No waiting events found dispatching network-vif-plugged-a2c5ed22-d705-4425-b72f-36792e06f5e7 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 723.081120] env[60548]: WARNING nova.compute.manager [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Received unexpected event network-vif-plugged-a2c5ed22-d705-4425-b72f-36792e06f5e7 for instance with vm_state building and task_state spawning.
[ 723.081120] env[60548]: DEBUG nova.compute.manager [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Received event network-changed-a2c5ed22-d705-4425-b72f-36792e06f5e7 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 723.081120] env[60548]: DEBUG nova.compute.manager [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Refreshing instance network info cache due to event network-changed-a2c5ed22-d705-4425-b72f-36792e06f5e7. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 723.081120] env[60548]: DEBUG oslo_concurrency.lockutils [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] Acquiring lock "refresh_cache-be11788c-634f-40c0-8c8c-d6253d0e68ad" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 723.081623] env[60548]: DEBUG oslo_concurrency.lockutils [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] Acquired lock "refresh_cache-be11788c-634f-40c0-8c8c-d6253d0e68ad" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 723.081623] env[60548]: DEBUG nova.network.neutron [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Refreshing network info cache for port a2c5ed22-d705-4425-b72f-36792e06f5e7 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 723.473981] env[60548]: DEBUG nova.network.neutron [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Updated VIF entry in instance network info cache for port a2c5ed22-d705-4425-b72f-36792e06f5e7. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 723.473981] env[60548]: DEBUG nova.network.neutron [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Updating instance_info_cache with network_info: [{"id": "a2c5ed22-d705-4425-b72f-36792e06f5e7", "address": "fa:16:3e:71:ca:ee", "network": {"id": "5afc36cf-8d60-4853-908d-19cd329fc59c", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-118094748-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "14b6fd8340554fe6b476f6958396a650", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7654928b-7afe-42e3-a18d-68ecc775cefe", "external-id": "cl2-zone-807", "segmentation_id": 807, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2c5ed22-d7", "ovs_interfaceid": "a2c5ed22-d705-4425-b72f-36792e06f5e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 723.488286] env[60548]: DEBUG oslo_concurrency.lockutils [req-c18a2985-97da-4891-9f90-509785b64a38 req-d1552583-8a56-4645-9838-bc671f72a13f service nova] Releasing lock "refresh_cache-be11788c-634f-40c0-8c8c-d6253d0e68ad" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 747.167131] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 747.191079] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 748.172258] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 748.172513] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 748.172645] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 748.182824] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 748.185670] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 748.185670] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 748.185670] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60548) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 748.185670] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50055dbe-6823-4738-810a-5b9544acb2b5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.194241] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10e47cbd-3510-4a82-9934-4a650bd45ac8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.208545] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64511867-2066-4804-a343-7df1d50f1c72 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.215842] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76da4b98-db86-4ed1-9b13-f84f21df06d3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.246342] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180690MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60548) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 748.246508] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 748.246709] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 748.311696] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3a7076f4-fc00-4f82-804b-4dac0de9ab3d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 
1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 748.311871] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 9a14b9d0-876b-45c6-825e-103caac6bef9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 748.312005] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b2a1b0c6-91cd-437d-b5bf-ca618b008c23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 748.312136] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b14d8d3f-9253-4e5f-a5b8-09a2f43888a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 748.312255] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 386edc81-5f27-4e44-af7a-f5e47ded1327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 748.312372] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 2751bdfb-2f28-48e0-98c2-f232ed6da6df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 748.312487] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 83ecd8bb-ba2b-4151-986b-26f50b54e8e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 748.312603] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance afb2cdc1-74ec-4d08-85cb-e96b4071f661 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 748.312717] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 748.312831] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance be11788c-634f-40c0-8c8c-d6253d0e68ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 748.323535] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 306f3cb9-3028-4ff2-8090-2c9c1c72efc1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.334194] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 46737200-2da8-41ee-b33e-3bb6cc3e4618 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.343984] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 67cbaf5c-e743-4e07-8f74-c51e4f57914d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.354025] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.363311] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance a878722d-7e36-4f15-8c5f-bd473375dd9b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.374507] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance deffd52b-d708-4c46-a168-18e80b05b133 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.385455] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance bf1694f2-6ad0-4e15-b05d-c73c24e0e955 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.399097] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance ecc4262d-6133-4541-aec8-fbee05180701 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.410361] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance c1b6b578-1bed-4e6c-8e7a-34c2e469cd80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.421243] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance fd3e6440-74fc-4425-9b7e-571245ddc379 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.432027] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance fbf4b0bc-d87a-4063-87ef-e44f72ecbf5c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.442576] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 81c71aa0-9c68-407a-9bef-708c2cb70b12 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.454189] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance bdae41ee-e9c3-4272-9623-ca88464ec45a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.465858] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3a0515f1-e61a-48d4-980d-49c7189dca2d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 748.466159] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 748.466345] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=100GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 748.781105] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81ea3e8a-07f6-46f6-a3d4-22d4c2a338e3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.789275] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0571275b-5af7-49c4-a2a2-ee5f3b5dbedf {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.819458] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6fe11ea-b341-4749-8284-add38dace808 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.827099] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79b09b9a-af1c-458b-8931-c00cd999eb7c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 748.840163] env[60548]: DEBUG nova.compute.provider_tree [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 748.848734] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 748.861589] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60548) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 748.861769] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 749.856457] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 749.856829] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 749.856879] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 749.856991] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60548) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 750.172477] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 750.172678] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Starting heal instance info cache {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 750.172805] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Rebuilding the list of instances to heal {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 750.194499] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.194685] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.194806] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.194938] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Skipping network cache update for instance because it is Building. 
{{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.195315] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.195482] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.195610] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.195732] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.195853] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.196044] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 750.196186] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Didn't find any instances for network info cache update. 
{{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 750.196719] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 755.066122] env[60548]: DEBUG nova.compute.manager [req-2f7166d5-32b5-4d18-8222-f5db01c0a9ba req-277e6587-b402-4a13-b5af-507e44b1a5a8 service nova] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Received event network-vif-deleted-fe9b78f9-9aa4-482f-8d36-4b5c359f7121 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 755.801448] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquiring lock "30cf201d-7a1c-479c-9040-fba38726d9ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 755.801731] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Lock "30cf201d-7a1c-479c-9040-fba38726d9ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 762.006084] env[60548]: WARNING oslo_vmware.rw_handles [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 762.006084] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 762.006084] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 762.006084] env[60548]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 762.006084] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 762.006084] env[60548]: ERROR oslo_vmware.rw_handles response.begin() [ 762.006084] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 762.006084] env[60548]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 762.006084] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 762.006084] env[60548]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 762.006084] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 762.006084] env[60548]: ERROR oslo_vmware.rw_handles [ 762.006084] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to
vmware_temp/30d80ae1-bdff-46bc-a12f-c0ddf084f4c2/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 762.007589] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Caching image {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 762.007835] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Copying Virtual Disk [datastore1] vmware_temp/30d80ae1-bdff-46bc-a12f-c0ddf084f4c2/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk to [datastore1] vmware_temp/30d80ae1-bdff-46bc-a12f-c0ddf084f4c2/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk {{(pid=60548) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 762.008118] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-192859ae-a1d4-4c01-9d91-4c15e96158a1 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.018754] env[60548]: DEBUG oslo_vmware.api [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Waiting for the task: (returnval){ [ 762.018754] env[60548]: value = "task-4323351" [ 762.018754] env[60548]: _type = "Task" [ 762.018754] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 762.025879] env[60548]: DEBUG oslo_vmware.api [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Task: {'id': task-4323351, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 762.529020] env[60548]: DEBUG oslo_vmware.exceptions [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Fault InvalidArgument not matched. 
{{(pid=60548) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 762.529302] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 762.529925] env[60548]: ERROR nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 762.529925] env[60548]: Faults: ['InvalidArgument'] [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Traceback (most recent call last): [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] yield resources [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] self.driver.spawn(context, instance, image_meta, [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] self._fetch_image_if_missing(context, vi) [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] image_cache(vi, tmp_image_ds_loc) [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] vm_util.copy_virtual_disk( [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] session._wait_for_task(vmdk_copy_task) [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] return self.wait_for_task(task_ref) [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] return evt.wait() [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] result = hub.switch() [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] return self.greenlet.switch() [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] self.f(*self.args, **self.kw) [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] raise exceptions.translate_fault(task_info.error) [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Faults: ['InvalidArgument'] [ 762.529925] env[60548]: ERROR nova.compute.manager [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] [ 762.530890] env[60548]: INFO nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Terminating instance [ 762.532038] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 762.532151] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 762.532321] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7ab8e142-a1b7-4b6e-a198-8cf6045b52da {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.534646] env[60548]: DEBUG nova.compute.manager 
[None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 762.534840] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 762.535593] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8eb89df-9182-4a65-9a98-2f5bb747c737 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.544490] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 762.544744] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c8731449-82a8-4f5e-a232-489619011b37 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.547344] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 762.547344] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 762.549339] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-58b8fd19-1d75-45ca-9d01-12cd045bee99 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.554928] env[60548]: DEBUG oslo_vmware.api [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Waiting for the task: (returnval){ [ 762.554928] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]524f9415-b254-69b5-076d-c479cdba6936" [ 762.554928] env[60548]: _type = "Task" [ 762.554928] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 762.563106] env[60548]: DEBUG oslo_vmware.api [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]524f9415-b254-69b5-076d-c479cdba6936, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 762.620764] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 762.620764] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 762.620937] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Deleting the datastore file [datastore1] 3a7076f4-fc00-4f82-804b-4dac0de9ab3d {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 762.621163] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-117a7434-54fb-4c7b-8c42-8ca2e4db4efc {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 762.630206] env[60548]: DEBUG oslo_vmware.api [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Waiting for the task: (returnval){ [ 762.630206] env[60548]: value = "task-4323353" [ 762.630206] env[60548]: _type = "Task" [ 762.630206] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 762.636761] env[60548]: DEBUG oslo_vmware.api [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Task: {'id': task-4323353, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 762.719858] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Acquiring lock "9a14b9d0-876b-45c6-825e-103caac6bef9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 763.065827] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 763.066232] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Creating directory with path [datastore1] vmware_temp/a118e48d-cb71-4616-9205-8081fce42d0e/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 763.066547] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-12a01c25-99c8-47ca-8598-d4afbb04ff12 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.078437] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Created directory with path [datastore1] vmware_temp/a118e48d-cb71-4616-9205-8081fce42d0e/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 763.078725] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Fetch image to [datastore1] vmware_temp/a118e48d-cb71-4616-9205-8081fce42d0e/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 763.078911] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/a118e48d-cb71-4616-9205-8081fce42d0e/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 763.079660] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bc40b62-7ae8-4049-99ce-51b7cb0f4ace {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.087192] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdd062a6-0c4a-40c4-bcad-1b921512b5e7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.097370] env[60548]: DEBUG oslo_vmware.service [-] Invoking
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9f4c5c1-b155-420f-97cf-bfedd4b290a1 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.127567] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10113d80-420f-417e-be70-28028839818b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.140064] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d03e980a-c9c0-4af9-8648-27c1aef977eb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 763.142026] env[60548]: DEBUG oslo_vmware.api [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Task: {'id': task-4323353, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072821} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 763.142273] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 763.142453] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 763.142619] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 763.142786] env[60548]: INFO nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 763.144942] env[60548]: DEBUG nova.compute.claims [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 763.145097] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 763.145359] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 763.175091] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.030s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 763.175833] env[60548]: DEBUG nova.compute.utils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Instance 3a7076f4-fc00-4f82-804b-4dac0de9ab3d could not be found. {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 763.177477] env[60548]: DEBUG nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Instance disappeared during build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 763.177630] env[60548]: DEBUG nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 763.177814] env[60548]: DEBUG nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 763.178013] env[60548]: DEBUG nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 763.178186] env[60548]: DEBUG nova.network.neutron [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 763.205491] env[60548]: DEBUG nova.network.neutron [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 763.215225] env[60548]: INFO nova.compute.manager [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Took 0.04 seconds to deallocate network for instance. [ 763.233329] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 763.276917] env[60548]: DEBUG oslo_concurrency.lockutils [None req-943340a9-8bfb-497d-894d-b41b7d054998 tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Lock "3a7076f4-fc00-4f82-804b-4dac0de9ab3d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 206.041s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 763.282426] env[60548]: DEBUG oslo_vmware.rw_handles [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a118e48d-cb71-4616-9205-8081fce42d0e/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 763.337203] env[60548]: DEBUG nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Starting instance... 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 763.341899] env[60548]: DEBUG oslo_vmware.rw_handles [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Completed reading data from the image iterator. {{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 763.342091] env[60548]: DEBUG oslo_vmware.rw_handles [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a118e48d-cb71-4616-9205-8081fce42d0e/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 763.386836] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 763.386836] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 763.388222] env[60548]: INFO nova.compute.claims [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 763.734570] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10f98f80-ac15-4393-a37f-8e9faea8ce95 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 763.742939] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d630ebb9-c68d-45c7-aa44-ac3168a39095 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 763.773843] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be8d1b35-9cc8-4ab3-b1cf-2074c829086f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 763.781826] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-655b71a7-fb68-41c0-ab4f-4b5998bfae49 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 763.796184] env[60548]: DEBUG nova.compute.provider_tree [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 763.804808] env[60548]: DEBUG nova.scheduler.client.report [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 763.818276] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.431s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 763.818735] env[60548]: DEBUG nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 763.853983] env[60548]: DEBUG nova.compute.utils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 763.855775] env[60548]: DEBUG nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 763.855950] env[60548]: DEBUG nova.network.neutron [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 763.864378] env[60548]: DEBUG nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 763.929860] env[60548]: DEBUG nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 763.954236] env[60548]: DEBUG nova.virt.hardware [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 763.954559] env[60548]: DEBUG nova.virt.hardware [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 763.954735] env[60548]: DEBUG nova.virt.hardware [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 763.954918] env[60548]: DEBUG nova.virt.hardware [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 763.955154] env[60548]: DEBUG nova.virt.hardware [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 763.955250] env[60548]: DEBUG nova.virt.hardware [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 763.955417] env[60548]: DEBUG nova.virt.hardware [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 763.955572] env[60548]: DEBUG nova.virt.hardware [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 763.955733] env[60548]: DEBUG nova.virt.hardware [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 763.955889] env[60548]: DEBUG nova.virt.hardware [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 763.956194] env[60548]: DEBUG nova.virt.hardware [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 763.957236] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7eaf8813-5990-4c01-b9cd-f80517392209 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 763.965716] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5047b026-ad59-4d4b-bed3-74007517cbc8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 763.987177] env[60548]: DEBUG nova.policy [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a55618f8417d4926a3441e2620494eb0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cce08d1ae47c4fc1b1f01412cac916bd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}}
[ 764.714220] env[60548]: DEBUG nova.network.neutron [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Successfully created port: fa3ddc74-9aa4-416d-b6c8-299e875a9e8d {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 765.176463] env[60548]: DEBUG oslo_concurrency.lockutils [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquiring lock "b2a1b0c6-91cd-437d-b5bf-ca618b008c23" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 765.818242] env[60548]: DEBUG nova.network.neutron [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Successfully updated port: fa3ddc74-9aa4-416d-b6c8-299e875a9e8d {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 765.826437] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquiring lock "refresh_cache-306f3cb9-3028-4ff2-8090-2c9c1c72efc1" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 765.826575] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquired lock "refresh_cache-306f3cb9-3028-4ff2-8090-2c9c1c72efc1" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 765.826723] env[60548]: DEBUG nova.network.neutron [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 765.846173] env[60548]: DEBUG nova.compute.manager [req-ac9f04c1-5c68-4a72-bb9c-45774bf73413 req-98e8f33e-c2e7-43d5-8d06-b8578ed8807c service nova] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Received event network-vif-plugged-fa3ddc74-9aa4-416d-b6c8-299e875a9e8d {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 765.846486] env[60548]: DEBUG oslo_concurrency.lockutils [req-ac9f04c1-5c68-4a72-bb9c-45774bf73413 req-98e8f33e-c2e7-43d5-8d06-b8578ed8807c service nova] Acquiring lock "306f3cb9-3028-4ff2-8090-2c9c1c72efc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 765.846925] env[60548]: DEBUG oslo_concurrency.lockutils [req-ac9f04c1-5c68-4a72-bb9c-45774bf73413 req-98e8f33e-c2e7-43d5-8d06-b8578ed8807c service nova] Lock "306f3cb9-3028-4ff2-8090-2c9c1c72efc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 765.847112] env[60548]: DEBUG oslo_concurrency.lockutils [req-ac9f04c1-5c68-4a72-bb9c-45774bf73413 req-98e8f33e-c2e7-43d5-8d06-b8578ed8807c service nova] Lock "306f3cb9-3028-4ff2-8090-2c9c1c72efc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 765.847283] env[60548]: DEBUG nova.compute.manager [req-ac9f04c1-5c68-4a72-bb9c-45774bf73413 req-98e8f33e-c2e7-43d5-8d06-b8578ed8807c service nova] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] No waiting events found dispatching network-vif-plugged-fa3ddc74-9aa4-416d-b6c8-299e875a9e8d {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 765.847524] env[60548]: WARNING nova.compute.manager [req-ac9f04c1-5c68-4a72-bb9c-45774bf73413 req-98e8f33e-c2e7-43d5-8d06-b8578ed8807c service nova] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Received unexpected event network-vif-plugged-fa3ddc74-9aa4-416d-b6c8-299e875a9e8d for instance with vm_state building and task_state spawning.
[ 765.911648] env[60548]: DEBUG nova.network.neutron [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 766.233374] env[60548]: DEBUG nova.network.neutron [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Updating instance_info_cache with network_info: [{"id": "fa3ddc74-9aa4-416d-b6c8-299e875a9e8d", "address": "fa:16:3e:e9:31:a7", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.119", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfa3ddc74-9a", "ovs_interfaceid": "fa3ddc74-9aa4-416d-b6c8-299e875a9e8d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 766.244951] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Releasing lock "refresh_cache-306f3cb9-3028-4ff2-8090-2c9c1c72efc1" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 766.245295] env[60548]: DEBUG nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Instance network_info: |[{"id": "fa3ddc74-9aa4-416d-b6c8-299e875a9e8d", "address": "fa:16:3e:e9:31:a7", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.119", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfa3ddc74-9a", "ovs_interfaceid": "fa3ddc74-9aa4-416d-b6c8-299e875a9e8d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 766.245662] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e9:31:a7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '15922696-dc08-44ef-97be-0b09a9dfeae8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fa3ddc74-9aa4-416d-b6c8-299e875a9e8d', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 766.253398] env[60548]: DEBUG oslo.service.loopingcall [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 766.254072] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 766.254332] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0d454d06-a6a8-4239-bf2c-2e454491fb91 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.276575] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 766.276575] env[60548]: value = "task-4323354"
[ 766.276575] env[60548]: _type = "Task"
[ 766.276575] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 766.284746] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323354, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 766.788490] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323354, 'name': CreateVM_Task, 'duration_secs': 0.314257} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 766.788822] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 766.790308] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 766.790574] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 766.790988] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 766.791352] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b6573ac6-36f5-4f9d-9569-ce05fbb54ee1 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.796787] env[60548]: DEBUG oslo_vmware.api [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Waiting for the task: (returnval){
[ 766.796787] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]522f33ca-2db6-ced1-42b0-194a599cbddd"
[ 766.796787] env[60548]: _type = "Task"
[ 766.796787] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 766.809342] env[60548]: DEBUG oslo_vmware.api [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]522f33ca-2db6-ced1-42b0-194a599cbddd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 767.309071] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 767.309327] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 767.309539] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 767.940251] env[60548]: DEBUG nova.compute.manager [req-78f0f296-dda6-40e3-bd1c-3c726fa5ac72 req-9141eee9-8d37-4e3b-ae24-adbb65f24c88 service nova] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Received event network-changed-fa3ddc74-9aa4-416d-b6c8-299e875a9e8d {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 767.940251] env[60548]: DEBUG nova.compute.manager [req-78f0f296-dda6-40e3-bd1c-3c726fa5ac72 req-9141eee9-8d37-4e3b-ae24-adbb65f24c88 service nova] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Refreshing instance network info cache due to event network-changed-fa3ddc74-9aa4-416d-b6c8-299e875a9e8d. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 767.940251] env[60548]: DEBUG oslo_concurrency.lockutils [req-78f0f296-dda6-40e3-bd1c-3c726fa5ac72 req-9141eee9-8d37-4e3b-ae24-adbb65f24c88 service nova] Acquiring lock "refresh_cache-306f3cb9-3028-4ff2-8090-2c9c1c72efc1" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 767.940370] env[60548]: DEBUG oslo_concurrency.lockutils [req-78f0f296-dda6-40e3-bd1c-3c726fa5ac72 req-9141eee9-8d37-4e3b-ae24-adbb65f24c88 service nova] Acquired lock "refresh_cache-306f3cb9-3028-4ff2-8090-2c9c1c72efc1" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 767.940566] env[60548]: DEBUG nova.network.neutron [req-78f0f296-dda6-40e3-bd1c-3c726fa5ac72 req-9141eee9-8d37-4e3b-ae24-adbb65f24c88 service nova] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Refreshing network info cache for port fa3ddc74-9aa4-416d-b6c8-299e875a9e8d {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 768.472928] env[60548]: DEBUG nova.network.neutron [req-78f0f296-dda6-40e3-bd1c-3c726fa5ac72 req-9141eee9-8d37-4e3b-ae24-adbb65f24c88 service nova] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Updated VIF entry in instance network info cache for port fa3ddc74-9aa4-416d-b6c8-299e875a9e8d. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 768.473349] env[60548]: DEBUG nova.network.neutron [req-78f0f296-dda6-40e3-bd1c-3c726fa5ac72 req-9141eee9-8d37-4e3b-ae24-adbb65f24c88 service nova] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Updating instance_info_cache with network_info: [{"id": "fa3ddc74-9aa4-416d-b6c8-299e875a9e8d", "address": "fa:16:3e:e9:31:a7", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.119", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfa3ddc74-9a", "ovs_interfaceid": "fa3ddc74-9aa4-416d-b6c8-299e875a9e8d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 768.484758] env[60548]: DEBUG oslo_concurrency.lockutils [req-78f0f296-dda6-40e3-bd1c-3c726fa5ac72 req-9141eee9-8d37-4e3b-ae24-adbb65f24c88 service nova] Releasing lock "refresh_cache-306f3cb9-3028-4ff2-8090-2c9c1c72efc1" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 771.678471] env[60548]: DEBUG oslo_concurrency.lockutils [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Acquiring lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 772.774117] env[60548]: DEBUG oslo_concurrency.lockutils [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Acquiring lock "386edc81-5f27-4e44-af7a-f5e47ded1327" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 806.171710] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 806.172016] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Cleaning up deleted instances {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}}
[ 806.189599] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] There are 1 instances to clean {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}}
[ 806.189935] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 3a7076f4-fc00-4f82-804b-4dac0de9ab3d] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 806.228119] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 806.228289] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Cleaning up deleted instances with incomplete migration {{(pid=60548) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}}
[ 806.236916] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 807.242784] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 808.172712] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 808.783159] env[60548]: WARNING oslo_vmware.rw_handles [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 808.783159] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 808.783159] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 808.783159] env[60548]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 808.783159] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 808.783159] env[60548]: ERROR oslo_vmware.rw_handles response.begin()
[ 808.783159] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 808.783159] env[60548]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 808.783159] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 808.783159] env[60548]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 808.783159] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 808.783159] env[60548]: ERROR oslo_vmware.rw_handles
[ 808.783835] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to vmware_temp/a118e48d-cb71-4616-9205-8081fce42d0e/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 808.785304] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Caching image {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 808.785547] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Copying Virtual Disk [datastore1] vmware_temp/a118e48d-cb71-4616-9205-8081fce42d0e/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk to [datastore1] vmware_temp/a118e48d-cb71-4616-9205-8081fce42d0e/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk {{(pid=60548) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 808.785823] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0a2eda90-f348-44a5-a3b1-1a317a6a8ddd {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 808.794738] env[60548]: DEBUG oslo_vmware.api [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Waiting for the task: (returnval){
[ 808.794738] env[60548]: value = "task-4323355"
[ 808.794738] env[60548]: _type = "Task"
[ 808.794738] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 808.803481] env[60548]: DEBUG oslo_vmware.api [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Task: {'id': task-4323355, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 809.171628] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 809.171955] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 809.172147] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60548) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 809.172312] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 809.181925] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 809.182161] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 809.182327] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 809.182480] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60548) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 809.183638] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a733bb1-f546-41f2-bf2a-143e68c137b2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.192726] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e180196-e26d-41f5-b157-7fd0f9c85a31 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.208863] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-382d0716-050c-4c88-9dff-c96cd780f828 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.217028] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f5951b4-d501-4183-b347-fa40ba44c929 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.250054] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180668MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60548) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 809.250236] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 809.250536] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 809.310636] env[60548]: DEBUG oslo_vmware.exceptions [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Fault InvalidArgument not matched. {{(pid=60548) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 809.310849] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 809.311410] env[60548]: ERROR nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 809.311410] env[60548]: Faults: ['InvalidArgument']
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Traceback (most recent call last):
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] yield resources
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] self.driver.spawn(context, instance, image_meta,
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] self._fetch_image_if_missing(context, vi)
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] image_cache(vi, tmp_image_ds_loc)
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] vm_util.copy_virtual_disk(
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] session._wait_for_task(vmdk_copy_task)
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] return self.wait_for_task(task_ref)
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] return evt.wait()
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] result = hub.switch()
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] return self.greenlet.switch()
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] self.f(*self.args, **self.kw)
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] raise exceptions.translate_fault(task_info.error)
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Faults: ['InvalidArgument']
[ 809.311410] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23]
[ 809.312320] env[60548]: INFO nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Terminating instance
[ 809.313518] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 809.313929] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 809.314437] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquiring lock "refresh_cache-b2a1b0c6-91cd-437d-b5bf-ca618b008c23" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 809.314588] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquired lock "refresh_cache-b2a1b0c6-91cd-437d-b5bf-ca618b008c23" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 809.314751] env[60548]: DEBUG nova.network.neutron [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 809.315797] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-445bfd6c-4541-4888-a966-fce548971923 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.325513] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 809.325703] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 809.327470] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 9a14b9d0-876b-45c6-825e-103caac6bef9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 809.327633] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b2a1b0c6-91cd-437d-b5bf-ca618b008c23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 809.327764] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b14d8d3f-9253-4e5f-a5b8-09a2f43888a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 809.327887] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 386edc81-5f27-4e44-af7a-f5e47ded1327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 809.328013] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 2751bdfb-2f28-48e0-98c2-f232ed6da6df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 809.328137] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 83ecd8bb-ba2b-4151-986b-26f50b54e8e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 809.328252] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance afb2cdc1-74ec-4d08-85cb-e96b4071f661 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 809.328368] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 809.328473] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance be11788c-634f-40c0-8c8c-d6253d0e68ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 809.328583] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 306f3cb9-3028-4ff2-8090-2c9c1c72efc1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 809.330376] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-577b3edb-e904-4001-ad79-3367d73f45b3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.340127] env[60548]: DEBUG oslo_vmware.api [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Waiting for the task: (returnval){
[ 809.340127] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]522da7b1-9daa-1743-56e5-99809a04db25"
[ 809.340127] env[60548]: _type = "Task"
[ 809.340127] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 809.341259] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 46737200-2da8-41ee-b33e-3bb6cc3e4618 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.351450] env[60548]: DEBUG oslo_vmware.api [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]522da7b1-9daa-1743-56e5-99809a04db25, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 809.354598] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 67cbaf5c-e743-4e07-8f74-c51e4f57914d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.365666] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.367820] env[60548]: DEBUG nova.network.neutron [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 809.375910] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance a878722d-7e36-4f15-8c5f-bd473375dd9b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.386239] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance deffd52b-d708-4c46-a168-18e80b05b133 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.396847] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance bf1694f2-6ad0-4e15-b05d-c73c24e0e955 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.407311] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance ecc4262d-6133-4541-aec8-fbee05180701 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.420063] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance c1b6b578-1bed-4e6c-8e7a-34c2e469cd80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.431068] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance fd3e6440-74fc-4425-9b7e-571245ddc379 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.441933] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance fbf4b0bc-d87a-4063-87ef-e44f72ecbf5c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.447661] env[60548]: DEBUG nova.network.neutron [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 809.453393] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 81c71aa0-9c68-407a-9bef-708c2cb70b12 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.455840] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Releasing lock "refresh_cache-b2a1b0c6-91cd-437d-b5bf-ca618b008c23" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 809.456298] env[60548]: DEBUG nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 809.456526] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 809.457577] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90b6cb59-2624-407e-8c27-019e5bfd2664 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.463469] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance bdae41ee-e9c3-4272-9623-ca88464ec45a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.466822] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 809.467238] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-37104081-7c31-465c-a1eb-5df07f3b7d2b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.474154] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3a0515f1-e61a-48d4-980d-49c7189dca2d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.485993] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 30cf201d-7a1c-479c-9040-fba38726d9ab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 809.486268] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 809.486416] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=100GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 809.510230] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 809.510467] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 809.510659] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Deleting the datastore file [datastore1] b2a1b0c6-91cd-437d-b5bf-ca618b008c23 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 809.510929] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b6c650b5-b620-43af-8ddd-474bdc72d627 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.518878] env[60548]: DEBUG oslo_vmware.api [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Waiting for the task: (returnval){
[ 809.518878] env[60548]: value = "task-4323357"
[ 809.518878] env[60548]: _type = "Task"
[ 809.518878] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 809.530548] env[60548]: DEBUG oslo_vmware.api [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Task: {'id': task-4323357, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 809.798019] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec3bb36f-77f2-4a3b-a329-5e0ec1af9508 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.807573] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec732f5c-9bfb-42ea-9b22-47bbb3bed308 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.838836] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-605add9c-eeb1-420f-a90c-ceee8d8a01e3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.852678] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45aac4f3-ea92-4967-918f-1d54c9c56e00 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.856309] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 809.856569] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Creating directory with path [datastore1] vmware_temp/2387ac0b-3d1e-4fd4-a2cf-4af7c614f173/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 809.857097] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-10c20813-86a1-4e5e-9fd1-114aed59e6d7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 809.866757] env[60548]: DEBUG nova.compute.provider_tree [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 809.870118] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Created directory with path [datastore1]
vmware_temp/2387ac0b-3d1e-4fd4-a2cf-4af7c614f173/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 809.870291] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Fetch image to [datastore1] vmware_temp/2387ac0b-3d1e-4fd4-a2cf-4af7c614f173/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 809.870425] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/2387ac0b-3d1e-4fd4-a2cf-4af7c614f173/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 809.871177] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-810c2d4d-ec16-42b4-9e25-e004a86e9837 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.875665] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 809.881401] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec31248b-e540-4c41-a143-8bd583014ee8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.891474] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60548) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 809.891678] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.641s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 809.893041] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c908af4-062c-4552-8118-836a8399a437 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.927061] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27c88089-ee92-4428-a65e-1975431a4299 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.933832] env[60548]: DEBUG 
oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5e51f598-4c79-4f97-a178-596d01d5a940 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 809.957471] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 810.005459] env[60548]: DEBUG oslo_vmware.rw_handles [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2387ac0b-3d1e-4fd4-a2cf-4af7c614f173/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 810.063980] env[60548]: DEBUG oslo_vmware.rw_handles [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Completed reading data from the image iterator. {{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 810.064189] env[60548]: DEBUG oslo_vmware.rw_handles [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2387ac0b-3d1e-4fd4-a2cf-4af7c614f173/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 810.069576] env[60548]: DEBUG oslo_vmware.api [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Task: {'id': task-4323357, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.037239} completed successfully. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 810.069859] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 810.070084] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 810.070263] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 810.070456] env[60548]: INFO nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Took 0.61 seconds to destroy the instance on the hypervisor. [ 810.070857] env[60548]: DEBUG oslo.service.loopingcall [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 810.070857] env[60548]: DEBUG nova.compute.manager [-] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 810.072995] env[60548]: DEBUG nova.compute.claims [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 810.073224] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 810.073379] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 810.509649] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a269b609-73ab-4b57-8922-1bec9c52850e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.518352] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-145c86d7-e552-4845-98b8-57b73831aee3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.548657] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b333952-7089-43a6-b709-704abebd8dfa {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.556627] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a7fd8a0-3d65-4c70-b90b-d081dded98ec {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.571234] env[60548]: DEBUG nova.compute.provider_tree [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 810.580383] env[60548]: DEBUG nova.scheduler.client.report [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 810.595020] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 
tempest-ServersAdmin275Test-568000092-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.521s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 810.595604] env[60548]: ERROR nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 810.595604] env[60548]: Faults: ['InvalidArgument'] [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Traceback (most recent call last): [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] self.driver.spawn(context, instance, image_meta, [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] self._vmops.spawn(context, instance, image_meta, injected_files, [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] self._fetch_image_if_missing(context, vi) [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] image_cache(vi, tmp_image_ds_loc) [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] vm_util.copy_virtual_disk( [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] session._wait_for_task(vmdk_copy_task) [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] return self.wait_for_task(task_ref) [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] return evt.wait() [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] result = hub.switch() [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] return self.greenlet.switch() [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] self.f(*self.args, **self.kw) [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] raise exceptions.translate_fault(task_info.error) [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Faults: ['InvalidArgument'] [ 810.595604] env[60548]: ERROR nova.compute.manager [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] [ 810.596806] env[60548]: DEBUG nova.compute.utils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] VimFaultException {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 810.598227] env[60548]: DEBUG nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Build of instance b2a1b0c6-91cd-437d-b5bf-ca618b008c23 was re-scheduled: A specified parameter was not correct: fileType [ 810.598227] env[60548]: Faults: ['InvalidArgument'] {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 810.598670] env[60548]: DEBUG nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 810.598944] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquiring lock "refresh_cache-b2a1b0c6-91cd-437d-b5bf-ca618b008c23" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 810.599119] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquired lock "refresh_cache-b2a1b0c6-91cd-437d-b5bf-ca618b008c23" {{(pid=60548) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 810.599260] env[60548]: DEBUG nova.network.neutron [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 810.627457] env[60548]: DEBUG nova.network.neutron [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 810.696179] env[60548]: DEBUG nova.network.neutron [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 810.706163] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Releasing lock "refresh_cache-b2a1b0c6-91cd-437d-b5bf-ca618b008c23" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 810.706408] env[60548]: DEBUG nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 810.706628] env[60548]: DEBUG nova.compute.manager [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 810.792120] env[60548]: INFO nova.scheduler.client.report [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Deleted allocations for instance b2a1b0c6-91cd-437d-b5bf-ca618b008c23 [ 810.808018] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e2551337-bf2e-49c0-bb2a-4d923160bab9 tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Lock "b2a1b0c6-91cd-437d-b5bf-ca618b008c23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 244.122s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 810.808018] env[60548]: DEBUG oslo_concurrency.lockutils [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Lock "b2a1b0c6-91cd-437d-b5bf-ca618b008c23" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 45.631s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 810.808018] env[60548]: DEBUG oslo_concurrency.lockutils [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquiring lock "b2a1b0c6-91cd-437d-b5bf-ca618b008c23-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 810.808018] env[60548]: DEBUG oslo_concurrency.lockutils [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Lock "b2a1b0c6-91cd-437d-b5bf-ca618b008c23-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 810.808018] env[60548]: DEBUG oslo_concurrency.lockutils [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Lock "b2a1b0c6-91cd-437d-b5bf-ca618b008c23-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 810.811417] env[60548]: INFO nova.compute.manager [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Terminating instance [ 810.813655] env[60548]: DEBUG oslo_concurrency.lockutils [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquiring lock "refresh_cache-b2a1b0c6-91cd-437d-b5bf-ca618b008c23" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 810.814896] env[60548]: DEBUG oslo_concurrency.lockutils [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Acquired lock "refresh_cache-b2a1b0c6-91cd-437d-b5bf-ca618b008c23" {{(pid=60548) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 810.814896] env[60548]: DEBUG nova.network.neutron [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 810.835190] env[60548]: DEBUG nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 810.839503] env[60548]: DEBUG nova.network.neutron [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 810.892288] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 810.892531] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 810.894072] env[60548]: INFO nova.compute.claims [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 810.896704] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 810.961470] env[60548]: DEBUG nova.scheduler.client.report [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Refreshing inventories for resource provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 810.976832] env[60548]: DEBUG nova.scheduler.client.report [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Updating ProviderTree inventory for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 
0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 810.977083] env[60548]: DEBUG nova.compute.provider_tree [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Updating inventory in ProviderTree for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 810.990215] env[60548]: DEBUG nova.scheduler.client.report [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Refreshing aggregate associations for resource provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64, aggregates: None {{(pid=60548) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 811.007820] env[60548]: DEBUG nova.scheduler.client.report [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Refreshing trait associations for resource provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=60548) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 811.167113] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 811.171221] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 811.171372] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Starting heal instance info cache {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 811.171489] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Rebuilding the list of instances to heal {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 811.187997] env[60548]: DEBUG nova.network.neutron [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 811.195376] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Skipping network cache update for instance because it is Building. 
{{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 811.195376] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 811.195376] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 811.195376] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 811.195376] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 811.195376] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 811.195376] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 811.195376] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 811.195376] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 811.195376] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 811.195376] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Didn't find any instances for network info cache update. 
{{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 811.197401] env[60548]: DEBUG oslo_concurrency.lockutils [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Releasing lock "refresh_cache-b2a1b0c6-91cd-437d-b5bf-ca618b008c23" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 811.197504] env[60548]: DEBUG nova.compute.manager [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 811.197681] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 811.198096] env[60548]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-43292980-f6d3-4142-9a7c-7c8bdd0b2d0c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.209966] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45b4cf69-abac-4505-8089-1ea0e2fb6af3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.245239] env[60548]: WARNING nova.virt.vmwareapi.vmops [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b2a1b0c6-91cd-437d-b5bf-ca618b008c23 could not be found. [ 811.245454] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 811.245644] env[60548]: INFO nova.compute.manager [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Took 0.05 seconds to destroy the instance on the hypervisor. [ 811.245933] env[60548]: DEBUG oslo.service.loopingcall [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 811.248596] env[60548]: DEBUG nova.compute.manager [-] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 811.248705] env[60548]: DEBUG nova.network.neutron [-] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 811.268752] env[60548]: DEBUG nova.network.neutron [-] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 811.276932] env[60548]: DEBUG nova.network.neutron [-] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 811.285290] env[60548]: INFO nova.compute.manager [-] [instance: b2a1b0c6-91cd-437d-b5bf-ca618b008c23] Took 0.04 seconds to deallocate network for instance. [ 811.348231] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bba701ac-656b-44c1-9e12-ab0e19d73943 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.357138] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58cb7a2a-78fa-4356-86bc-4ba226ef1bf9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.394401] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20905c29-9903-4061-b1bf-05188a547dbb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.403077] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39e88496-bd87-47a9-8214-e2ba8bb26513 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.417499] env[60548]: DEBUG nova.compute.provider_tree [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 811.426925] env[60548]: DEBUG nova.scheduler.client.report [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 811.432789] env[60548]: DEBUG oslo_concurrency.lockutils [None req-594f6729-2bbd-4856-b435-f7ee6534c2cf tempest-ServersAdmin275Test-568000092 tempest-ServersAdmin275Test-568000092-project-member] Lock 
"b2a1b0c6-91cd-437d-b5bf-ca618b008c23" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.626s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 811.446373] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.554s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 811.447781] env[60548]: DEBUG nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 811.481538] env[60548]: DEBUG nova.compute.utils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 811.483583] env[60548]: DEBUG nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 811.483583] env[60548]: DEBUG nova.network.neutron [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 811.497324] env[60548]: DEBUG nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 811.565249] env[60548]: DEBUG nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Start spawning the instance on the hypervisor. 
{{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 811.587256] env[60548]: DEBUG nova.policy [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd6b334eabb2a43d6b5c76c16a022591e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '75b9fd168b9c45a8a4d2f7508844570f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 811.589786] env[60548]: DEBUG nova.virt.hardware [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 811.589985] env[60548]: DEBUG nova.virt.hardware [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 811.590192] env[60548]: DEBUG nova.virt.hardware [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 811.590398] env[60548]: DEBUG nova.virt.hardware [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 811.590579] env[60548]: DEBUG nova.virt.hardware [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 811.590746] env[60548]: DEBUG nova.virt.hardware [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 811.590977] env[60548]: DEBUG nova.virt.hardware [None 
req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 811.591198] env[60548]: DEBUG nova.virt.hardware [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 811.591403] env[60548]: DEBUG nova.virt.hardware [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 811.591708] env[60548]: DEBUG nova.virt.hardware [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 811.591794] env[60548]: DEBUG nova.virt.hardware [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 811.592663] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d6c8128-2ab7-46e9-9d73-4392674bf450 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 811.601919] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fbde7d4-cfc8-4afd-86c4-6c6ae7506825 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 812.171404] env[60548]: DEBUG nova.network.neutron [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Successfully created port: 1dea67b3-43ea-46e2-837b-2df67997648f {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 812.173538] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 813.587199] env[60548]: DEBUG nova.network.neutron [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Successfully updated port: 1dea67b3-43ea-46e2-837b-2df67997648f {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 813.597126] env[60548]: DEBUG nova.compute.manager [req-7b731393-ce89-4f22-b1fa-f25411a53c05 req-ca16919b-f7b7-45c8-a6eb-576f03519070 service nova] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Received event network-vif-plugged-1dea67b3-43ea-46e2-837b-2df67997648f {{(pid=60548) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 813.597432] env[60548]: DEBUG oslo_concurrency.lockutils [req-7b731393-ce89-4f22-b1fa-f25411a53c05 req-ca16919b-f7b7-45c8-a6eb-576f03519070 service nova] Acquiring lock "46737200-2da8-41ee-b33e-3bb6cc3e4618-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 813.597615] env[60548]: DEBUG oslo_concurrency.lockutils [req-7b731393-ce89-4f22-b1fa-f25411a53c05 req-ca16919b-f7b7-45c8-a6eb-576f03519070 service nova] Lock "46737200-2da8-41ee-b33e-3bb6cc3e4618-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 813.597784] env[60548]: DEBUG oslo_concurrency.lockutils [req-7b731393-ce89-4f22-b1fa-f25411a53c05 req-ca16919b-f7b7-45c8-a6eb-576f03519070 service nova] Lock "46737200-2da8-41ee-b33e-3bb6cc3e4618-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 813.597946] env[60548]: DEBUG nova.compute.manager [req-7b731393-ce89-4f22-b1fa-f25411a53c05 req-ca16919b-f7b7-45c8-a6eb-576f03519070 service nova] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] No waiting events found dispatching network-vif-plugged-1dea67b3-43ea-46e2-837b-2df67997648f {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 813.598121] env[60548]: WARNING nova.compute.manager [req-7b731393-ce89-4f22-b1fa-f25411a53c05 req-ca16919b-f7b7-45c8-a6eb-576f03519070 service nova] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Received unexpected event network-vif-plugged-1dea67b3-43ea-46e2-837b-2df67997648f for instance with vm_state building and task_state spawning. [ 813.599647] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Acquiring lock "refresh_cache-46737200-2da8-41ee-b33e-3bb6cc3e4618" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 813.599784] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Acquired lock "refresh_cache-46737200-2da8-41ee-b33e-3bb6cc3e4618" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 813.599937] env[60548]: DEBUG nova.network.neutron [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 813.681732] env[60548]: DEBUG nova.network.neutron [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Instance cache missing network info. 
{{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 814.053048] env[60548]: DEBUG nova.network.neutron [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Updating instance_info_cache with network_info: [{"id": "1dea67b3-43ea-46e2-837b-2df67997648f", "address": "fa:16:3e:ed:5d:cc", "network": {"id": "995fbca0-92a5-4d12-bcd5-ff02d4a7cedd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2120407924-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "75b9fd168b9c45a8a4d2f7508844570f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e044cfd4-1b0d-4d88-b1bd-604025731d3f", "external-id": "nsx-vlan-transportzone-372", "segmentation_id": 372, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dea67b3-43", "ovs_interfaceid": "1dea67b3-43ea-46e2-837b-2df67997648f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 814.067680] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Releasing lock "refresh_cache-46737200-2da8-41ee-b33e-3bb6cc3e4618" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 814.068065] env[60548]: DEBUG nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Instance network_info: |[{"id": "1dea67b3-43ea-46e2-837b-2df67997648f", "address": "fa:16:3e:ed:5d:cc", "network": {"id": "995fbca0-92a5-4d12-bcd5-ff02d4a7cedd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2120407924-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "75b9fd168b9c45a8a4d2f7508844570f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e044cfd4-1b0d-4d88-b1bd-604025731d3f", "external-id": "nsx-vlan-transportzone-372", "segmentation_id": 372, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dea67b3-43", "ovs_interfaceid": "1dea67b3-43ea-46e2-837b-2df67997648f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 814.068442] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ed:5d:cc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e044cfd4-1b0d-4d88-b1bd-604025731d3f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1dea67b3-43ea-46e2-837b-2df67997648f', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 814.076784] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Creating folder: Project (75b9fd168b9c45a8a4d2f7508844570f). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 814.077299] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c8f01b2b-cbde-41d8-947d-e8b62965e086 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 814.087616] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Created folder: Project (75b9fd168b9c45a8a4d2f7508844570f) in parent group-v850287. [ 814.087851] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Creating folder: Instances. Parent ref: group-v850339. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 814.088117] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-21819d02-dea1-474e-a26f-e31ccd7f37be {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 814.097184] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Created folder: Instances in parent group-v850339. [ 814.097453] env[60548]: DEBUG oslo.service.loopingcall [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 814.097669] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 814.097914] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f1d9f062-c054-4e84-9460-6b4f872e7921 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 814.120355] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 814.120355] env[60548]: value = "task-4323360" [ 814.120355] env[60548]: _type = "Task" [ 814.120355] env[60548]: } to complete. 
{{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 814.129186] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323360, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 814.630843] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323360, 'name': CreateVM_Task, 'duration_secs': 0.330047} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 814.631205] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 814.632227] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 814.632440] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 814.632826] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 814.633094] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5fbfc518-b931-4965-a41b-06ae4676c0cd {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 814.638577] env[60548]: DEBUG oslo_vmware.api [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Waiting for the task: (returnval){ [ 814.638577] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]522fcd1b-619f-7ef5-ffea-4caf46641b8f" [ 814.638577] env[60548]: _type = "Task" [ 814.638577] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 814.647607] env[60548]: DEBUG oslo_vmware.api [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]522fcd1b-619f-7ef5-ffea-4caf46641b8f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 815.151305] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 815.151670] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 815.151961] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 815.661954] env[60548]: DEBUG nova.compute.manager [req-09c0be9b-6aad-488d-a0cf-b831e8487e26 req-98546b35-3209-49ac-adba-32dd6df198d9 service nova] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Received event network-changed-1dea67b3-43ea-46e2-837b-2df67997648f {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 815.662276] env[60548]: DEBUG nova.compute.manager [req-09c0be9b-6aad-488d-a0cf-b831e8487e26 req-98546b35-3209-49ac-adba-32dd6df198d9 service nova] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Refreshing instance network info cache due to event network-changed-1dea67b3-43ea-46e2-837b-2df67997648f. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 815.662412] env[60548]: DEBUG oslo_concurrency.lockutils [req-09c0be9b-6aad-488d-a0cf-b831e8487e26 req-98546b35-3209-49ac-adba-32dd6df198d9 service nova] Acquiring lock "refresh_cache-46737200-2da8-41ee-b33e-3bb6cc3e4618" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 815.662559] env[60548]: DEBUG oslo_concurrency.lockutils [req-09c0be9b-6aad-488d-a0cf-b831e8487e26 req-98546b35-3209-49ac-adba-32dd6df198d9 service nova] Acquired lock "refresh_cache-46737200-2da8-41ee-b33e-3bb6cc3e4618" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 815.662742] env[60548]: DEBUG nova.network.neutron [req-09c0be9b-6aad-488d-a0cf-b831e8487e26 req-98546b35-3209-49ac-adba-32dd6df198d9 service nova] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Refreshing network info cache for port 1dea67b3-43ea-46e2-837b-2df67997648f {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 816.094899] env[60548]: DEBUG nova.network.neutron [req-09c0be9b-6aad-488d-a0cf-b831e8487e26 req-98546b35-3209-49ac-adba-32dd6df198d9 service nova] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Updated VIF entry in instance network info cache for port 1dea67b3-43ea-46e2-837b-2df67997648f. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 816.095253] env[60548]: DEBUG nova.network.neutron [req-09c0be9b-6aad-488d-a0cf-b831e8487e26 req-98546b35-3209-49ac-adba-32dd6df198d9 service nova] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Updating instance_info_cache with network_info: [{"id": "1dea67b3-43ea-46e2-837b-2df67997648f", "address": "fa:16:3e:ed:5d:cc", "network": {"id": "995fbca0-92a5-4d12-bcd5-ff02d4a7cedd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2120407924-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "75b9fd168b9c45a8a4d2f7508844570f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e044cfd4-1b0d-4d88-b1bd-604025731d3f", "external-id": "nsx-vlan-transportzone-372", "segmentation_id": 372, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1dea67b3-43", "ovs_interfaceid": "1dea67b3-43ea-46e2-837b-2df67997648f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 816.109145] env[60548]: DEBUG oslo_concurrency.lockutils [req-09c0be9b-6aad-488d-a0cf-b831e8487e26 req-98546b35-3209-49ac-adba-32dd6df198d9 service nova] Releasing lock "refresh_cache-46737200-2da8-41ee-b33e-3bb6cc3e4618" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 835.189318] env[60548]: DEBUG nova.compute.manager [req-94b3a449-e2be-45dd-bf73-e263b50e9dce req-1bdc8282-7a17-4668-91d7-3dc89964351e service nova] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Received event network-vif-deleted-87172cab-7cca-4f9e-b9c3-3850418db9e5 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 838.078489] env[60548]: DEBUG nova.compute.manager [req-2fd4e9b0-0a74-46d4-944e-c9ea57d975e0 req-039fa888-6719-48e7-8131-32899b4c8333 service nova] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Received event network-vif-deleted-2c311e52-2180-42f1-b2a9-c1f5cc33a26c {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 846.635661] env[60548]: DEBUG nova.compute.manager [req-c834a9a3-00c4-48b6-a353-682c91a6f6fd req-d5e55925-e4af-44eb-9f2d-e70b9753b655 service nova] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Received event network-vif-deleted-b25225e3-3b7a-4efb-b82f-d2bbd02040ff {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 853.137522] env[60548]: DEBUG nova.compute.manager [req-8bb14c68-00a4-4015-9360-fd340ad4b831 req-91628325-5f9d-4cae-9502-10f9bd945f55 service nova] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Received event network-vif-deleted-a2c5ed22-d705-4425-b72f-36792e06f5e7 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 856.374178] env[60548]: DEBUG nova.compute.manager [req-c7fbde3f-7aae-427e-995a-26186e07d4cd req-d924bd5a-28ba-4096-b6ea-61e96ffcb654 service nova] [instance: 
306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Received event network-vif-deleted-fa3ddc74-9aa4-416d-b6c8-299e875a9e8d {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 857.039168] env[60548]: WARNING oslo_vmware.rw_handles [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 857.039168] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 857.039168] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 857.039168] env[60548]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 857.039168] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 857.039168] env[60548]: ERROR oslo_vmware.rw_handles response.begin() [ 857.039168] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 857.039168] env[60548]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 857.039168] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 857.039168] env[60548]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 857.039168] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 857.039168] env[60548]: ERROR oslo_vmware.rw_handles [ 857.039941] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to vmware_temp/2387ac0b-3d1e-4fd4-a2cf-4af7c614f173/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 857.041888] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Caching image {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 857.043971] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Copying Virtual Disk [datastore1] vmware_temp/2387ac0b-3d1e-4fd4-a2cf-4af7c614f173/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk to [datastore1] vmware_temp/2387ac0b-3d1e-4fd4-a2cf-4af7c614f173/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk {{(pid=60548) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 857.044344] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cf92db7c-4cc2-4253-862b-7ac3ff58c1b4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.056377] env[60548]: DEBUG oslo_vmware.api [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Waiting for the task: (returnval){ 
[ 857.056377] env[60548]: value = "task-4323361" [ 857.056377] env[60548]: _type = "Task" [ 857.056377] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 857.065230] env[60548]: DEBUG oslo_vmware.api [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Task: {'id': task-4323361, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 857.570581] env[60548]: DEBUG oslo_vmware.exceptions [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Fault InvalidArgument not matched. {{(pid=60548) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 857.570581] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 857.570581] env[60548]: ERROR nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 857.570581] env[60548]: Faults: ['InvalidArgument'] [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Traceback (most recent call last): [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] yield resources [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] self.driver.spawn(context, instance, image_meta, [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] self._fetch_image_if_missing(context, vi) [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] image_cache(vi, tmp_image_ds_loc) [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 
9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] vm_util.copy_virtual_disk( [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] session._wait_for_task(vmdk_copy_task) [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] return self.wait_for_task(task_ref) [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] return evt.wait() [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] result = hub.switch() [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] return self.greenlet.switch() [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] self.f(*self.args, **self.kw) [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] raise exceptions.translate_fault(task_info.error) [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Faults: ['InvalidArgument'] [ 857.570581] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] [ 857.570581] env[60548]: INFO nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Terminating instance [ 857.573265] env[60548]: DEBUG nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Start destroying the instance on the hypervisor. 
{{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 857.573614] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 857.573910] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 857.574373] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 857.575214] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d97c17ec-8323-4210-98a6-c0e245fb9d8a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.578148] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8136193a-626c-411c-84aa-a84c3797d03f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.585204] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 857.585462] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ef11c39f-4767-48ed-b0a3-82020bf2ce23 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.588496] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 857.588768] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 857.589855] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-43db59e2-a9e4-4052-81d3-79548ae6b7e2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.595098] env[60548]: DEBUG oslo_vmware.api [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Waiting for the task: (returnval){ [ 857.595098] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52bd38b9-374f-2acc-3a69-1555839f65fe" [ 857.595098] env[60548]: _type = "Task" [ 857.595098] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 857.611218] env[60548]: DEBUG oslo_vmware.api [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52bd38b9-374f-2acc-3a69-1555839f65fe, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 857.670243] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 857.670407] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 857.670532] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Deleting the datastore file [datastore1] 9a14b9d0-876b-45c6-825e-103caac6bef9 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 857.670871] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-86056554-1361-4b63-92ed-3f2bb95a12d4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.678812] env[60548]: DEBUG oslo_vmware.api [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Waiting for the task: (returnval){ [ 857.678812] env[60548]: value = "task-4323363" [ 857.678812] env[60548]: _type = "Task" [ 857.678812] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 857.691099] env[60548]: DEBUG oslo_vmware.api [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Task: {'id': task-4323363, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 858.108852] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 858.108852] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Creating directory with path [datastore1] vmware_temp/6930019e-1fbc-46a1-829c-3c0410ee5da2/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 858.108852] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4f4a1a83-e3ef-42e5-b480-a211e05c485f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.122181] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Created directory with path [datastore1] vmware_temp/6930019e-1fbc-46a1-829c-3c0410ee5da2/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 858.122181] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Fetch image to [datastore1] vmware_temp/6930019e-1fbc-46a1-829c-3c0410ee5da2/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 858.122181] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/6930019e-1fbc-46a1-829c-3c0410ee5da2/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 858.122181] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15a3a850-fc10-4223-b59c-5f68c575e0cb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.135099] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d361fced-82d5-4616-8cb2-687dfd71c578 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.146634] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cedd405d-a681-439a-9ee1-f34440811d2f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.185850] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fa7aaaf-9131-4913-8fbf-fc5f52876328 {{(pid=60548) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.195966] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-08b37845-c384-4978-8372-625f7e399a1a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.198065] env[60548]: DEBUG oslo_vmware.api [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Task: {'id': task-4323363, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08505} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 858.198375] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 858.198606] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 858.198826] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 858.199089] env[60548]: INFO nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Took 0.63 seconds to destroy the instance on the hypervisor. 
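The DeleteDatastoreFile_Task records above follow oslo.vmware's generic task pattern: the SOAP call returns a task reference immediately, and the client polls the task's state, logging "progress is N%" on each pass, until the task reports success (with a duration_secs) or an error that is translated into an exception. A minimal sketch of that polling loop, assuming hypothetical stand-ins (TaskInfo, fetch_task_info) for the real PropertyCollector round trip rather than oslo.vmware's actual implementation:

    import time
    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class TaskInfo:
        state: str                   # "queued" | "running" | "success" | "error"
        progress: int = 0            # percent complete, as logged by _poll_task
        error: Optional[str] = None  # fault detail, e.g. "InvalidArgument"

    def wait_for_task(fetch_task_info: Callable[[str], TaskInfo],
                      task_ref: str, interval: float = 0.5) -> TaskInfo:
        """Poll a vCenter task until it finishes, mirroring the log flow:
        'Waiting for the task' -> repeated 'progress is N%' -> completed/raised."""
        start = time.monotonic()
        while True:
            info = fetch_task_info(task_ref)  # one RetrievePropertiesEx round trip
            if info.state == "success":
                duration = time.monotonic() - start
                print(f"Task {task_ref} completed successfully ({duration:.6f}s)")
                return info
            if info.state == "error":
                # oslo.vmware first tries to map the fault to a known class,
                # hence the "Fault InvalidArgument not matched" record earlier
                raise RuntimeError(f"Task {task_ref} failed: {info.error}")
            print(f"Task {task_ref} progress is {info.progress}%")
            time.sleep(interval)  # the real loop runs under a looping call

In the real driver this loop runs inside a looping call and blocks the calling greenthread on an eventlet event, which is why the tracebacks in this section pass through evt.wait() and hub.switch().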
[ 858.201474] env[60548]: DEBUG nova.compute.claims [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 858.201702] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 858.201969] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 858.222591] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 858.355349] env[60548]: DEBUG oslo_vmware.rw_handles [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6930019e-1fbc-46a1-829c-3c0410ee5da2/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 858.418415] env[60548]: DEBUG oslo_vmware.rw_handles [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Completed reading data from the image iterator. {{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 858.418536] env[60548]: DEBUG oslo_vmware.rw_handles [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6930019e-1fbc-46a1-829c-3c0410ee5da2/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 858.539966] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9eab8749-a3c3-4409-ae04-f8efe7519794 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.550032] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2c1142a-d20d-49ef-a8f0-e8c09655e7af {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.588671] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edd4e025-d1a3-4ac3-962a-6294c6df2ca8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.596918] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cea96b9-0710-46f9-94a9-f3d89ffc01eb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.612104] env[60548]: DEBUG nova.compute.provider_tree [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 858.621494] env[60548]: DEBUG nova.scheduler.client.report [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 858.636558] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.434s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 858.637178] env[60548]: ERROR nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 858.637178] env[60548]: Faults: ['InvalidArgument'] [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Traceback (most recent call last): [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] self.driver.spawn(context, instance, image_meta, [ 858.637178] 
env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] self._fetch_image_if_missing(context, vi) [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] image_cache(vi, tmp_image_ds_loc) [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] vm_util.copy_virtual_disk( [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] session._wait_for_task(vmdk_copy_task) [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] return self.wait_for_task(task_ref) [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] return evt.wait() [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] result = hub.switch() [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] return self.greenlet.switch() [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] self.f(*self.args, **self.kw) [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] raise 
exceptions.translate_fault(task_info.error) [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Faults: ['InvalidArgument'] [ 858.637178] env[60548]: ERROR nova.compute.manager [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] [ 858.638229] env[60548]: DEBUG nova.compute.utils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] VimFaultException {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 858.639597] env[60548]: DEBUG nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Build of instance 9a14b9d0-876b-45c6-825e-103caac6bef9 was re-scheduled: A specified parameter was not correct: fileType [ 858.639597] env[60548]: Faults: ['InvalidArgument'] {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 858.639994] env[60548]: DEBUG nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 858.640200] env[60548]: DEBUG nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 858.640369] env[60548]: DEBUG nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 858.640531] env[60548]: DEBUG nova.network.neutron [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 858.731193] env[60548]: DEBUG nova.compute.manager [req-49b9184a-0cfe-4f02-9390-235364448903 req-7409c171-4635-4bed-8f32-717210efffc3 service nova] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Received event network-vif-deleted-1dea67b3-43ea-46e2-837b-2df67997648f {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 858.731407] env[60548]: DEBUG nova.compute.manager [req-49b9184a-0cfe-4f02-9390-235364448903 req-7409c171-4635-4bed-8f32-717210efffc3 service nova] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Received event network-vif-deleted-7696794d-4c5d-406f-b4e5-b0e3e34a7019 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 859.182342] env[60548]: DEBUG nova.network.neutron [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 859.203268] env[60548]: INFO nova.compute.manager [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Took 0.56 seconds to deallocate network for instance. 
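The traceback and the records after it trace the compute manager's build-failure path: driver.spawn raises the vCenter fault, "Failed to build and run instance" is logged, the resource claim is aborted under the "compute_resources" lock, networking is deallocated, and the build is handed back to the scheduler ("was re-scheduled"). A condensed sketch of that control flow, with spawn, abort_claim, deallocate_network and reschedule as simplified stand-ins for the nova.compute.manager and ResourceTracker machinery, not their real signatures:

    import threading

    # Stands in for the oslo.concurrency lock named "compute_resources" in the log.
    compute_resources_lock = threading.Lock()

    def build_and_run(instance_id, spawn, abort_claim,
                      deallocate_network, reschedule):
        try:
            spawn(instance_id)  # self.driver.spawn(...) in the traceback
        except Exception as exc:  # e.g. the VimFaultException above
            print(f"Failed to build and run instance {instance_id}: {exc}")
            with compute_resources_lock:
                abort_claim(instance_id)  # ResourceTracker.abort_instance_claim
            deallocate_network(instance_id)  # "Deallocating network for instance"
            reschedule(instance_id, reason=str(exc))  # back to the scheduler
            return
        # on success the manager would continue powering on the instance

Aborting the claim under the same lock that instance_claim takes keeps the resource tracker's usage totals consistent while other builds (such as the ServerGroupTestJSON claim later in this section) are claiming against the same node.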
[ 859.319462] env[60548]: INFO nova.scheduler.client.report [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Deleted allocations for instance 9a14b9d0-876b-45c6-825e-103caac6bef9 [ 859.342542] env[60548]: DEBUG oslo_concurrency.lockutils [None req-24d3a34a-be7e-4c97-8991-2487c305083e tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Lock "9a14b9d0-876b-45c6-825e-103caac6bef9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 293.550s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 859.344320] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Lock "9a14b9d0-876b-45c6-825e-103caac6bef9" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 96.624s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 859.344632] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Acquiring lock "9a14b9d0-876b-45c6-825e-103caac6bef9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 859.344857] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Lock "9a14b9d0-876b-45c6-825e-103caac6bef9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 859.345112] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Lock "9a14b9d0-876b-45c6-825e-103caac6bef9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 859.347183] env[60548]: INFO nova.compute.manager [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Terminating instance [ 859.350148] env[60548]: DEBUG nova.compute.manager [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Start destroying the instance on the hypervisor. 
{{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 859.350361] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 859.350608] env[60548]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5f712f35-7fa4-40b9-ba80-119612bafb3d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.361154] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1380ab6d-8152-4032-83e5-c66aee5e3f95 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.372382] env[60548]: DEBUG nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 859.400053] env[60548]: WARNING nova.virt.vmwareapi.vmops [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9a14b9d0-876b-45c6-825e-103caac6bef9 could not be found. [ 859.400284] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 859.400459] env[60548]: INFO nova.compute.manager [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Took 0.05 seconds to destroy the instance on the hypervisor. [ 859.400697] env[60548]: DEBUG oslo.service.loopingcall [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 859.400939] env[60548]: DEBUG nova.compute.manager [-] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 859.401062] env[60548]: DEBUG nova.network.neutron [-] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 859.430892] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 859.431144] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 859.432619] env[60548]: INFO nova.compute.claims [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 859.436414] env[60548]: DEBUG nova.network.neutron [-] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 859.444161] env[60548]: INFO nova.compute.manager [-] [instance: 9a14b9d0-876b-45c6-825e-103caac6bef9] Took 0.04 seconds to deallocate network for instance. 
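The acquired/waited/held lines above come from oslo.concurrency's synchronized decorator (the inner wrapper at lockutils.py:404/409/423), while the refresh_cache entries further down use the bare lock context manager (lockutils.py:312/315/333). A minimal illustration of both primitives, with placeholder bodies rather than Nova's real ResourceTracker logic:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim(instance_uuid):
        # Runs with the in-process lock held; the decorator's inner wrapper
        # is what logs "acquired ... waited Ns" and "released ... held Ns".
        pass

    def refresh_cache(instance_uuid):
        # The context-manager form logs the plain Acquiring/Acquired/Releasing
        # lines seen around "refresh_cache-<uuid>".
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass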
[ 859.550054] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d54fbeb9-98d7-4ea2-ac90-398d9ffb9b8c tempest-ImagesTestJSON-104737604 tempest-ImagesTestJSON-104737604-project-member] Lock "9a14b9d0-876b-45c6-825e-103caac6bef9" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.206s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 859.739782] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-543d749c-c237-4723-8714-8c81b863de4a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.747603] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63341691-880f-4948-b890-19749ad0a615 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.777855] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1afdb3fc-3bbb-428f-aa8e-a54feaad2f84 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.785961] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35beb8d3-3f77-4d9e-b1a0-09d5edad3ab8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.800761] env[60548]: DEBUG nova.compute.provider_tree [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 859.809269] env[60548]: DEBUG nova.scheduler.client.report [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 859.822856] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.391s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 859.838119] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Acquiring lock "dc54ec98-9fb0-4e99-a38c-dff83f4c3e0c" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 859.838375] env[60548]: DEBUG oslo_concurrency.lockutils [None 
req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Lock "dc54ec98-9fb0-4e99-a38c-dff83f4c3e0c" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 859.843639] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Lock "dc54ec98-9fb0-4e99-a38c-dff83f4c3e0c" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: held 0.005s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 859.844175] env[60548]: DEBUG nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 859.873405] env[60548]: DEBUG nova.compute.utils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 859.874921] env[60548]: DEBUG nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 859.874921] env[60548]: DEBUG nova.network.neutron [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 859.883797] env[60548]: DEBUG nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Start building block device mappings for instance. 
{{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 859.961454] env[60548]: DEBUG nova.policy [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '887300ba477a4a609b203656399d428e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e1b61c42902e4ab48c30c1e4b65dc169', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 860.483202] env[60548]: DEBUG nova.network.neutron [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Successfully created port: 3d4a9268-2d6e-49e4-a0ba-464aefeef366 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 861.780034] env[60548]: DEBUG nova.network.neutron [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Successfully updated port: 3d4a9268-2d6e-49e4-a0ba-464aefeef366 {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 861.786193] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Acquiring lock "refresh_cache-67cbaf5c-e743-4e07-8f74-c51e4f57914d" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 861.787432] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Acquired lock "refresh_cache-67cbaf5c-e743-4e07-8f74-c51e4f57914d" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 861.788767] env[60548]: DEBUG nova.network.neutron [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 861.798584] env[60548]: DEBUG nova.compute.utils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Can not refresh info_cache because instance was not found {{(pid=60548) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 861.867608] env[60548]: DEBUG nova.network.neutron [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Instance cache missing network info. 
{{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 862.213075] env[60548]: DEBUG nova.network.neutron [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Updating instance_info_cache with network_info: [{"id": "3d4a9268-2d6e-49e4-a0ba-464aefeef366", "address": "fa:16:3e:2c:5e:73", "network": {"id": "5ad9453a-029c-4c69-9e53-ae31a0ded738", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1832943623-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e1b61c42902e4ab48c30c1e4b65dc169", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d19577c9-1b2e-490b-8031-2f278dd3f570", "external-id": "nsx-vlan-transportzone-611", "segmentation_id": 611, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3d4a9268-2d", "ovs_interfaceid": "3d4a9268-2d6e-49e4-a0ba-464aefeef366", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 862.228882] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Releasing lock "refresh_cache-67cbaf5c-e743-4e07-8f74-c51e4f57914d" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 862.228882] env[60548]: DEBUG nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Instance network_info: |[{"id": "3d4a9268-2d6e-49e4-a0ba-464aefeef366", "address": "fa:16:3e:2c:5e:73", "network": {"id": "5ad9453a-029c-4c69-9e53-ae31a0ded738", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-1832943623-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e1b61c42902e4ab48c30c1e4b65dc169", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d19577c9-1b2e-490b-8031-2f278dd3f570", "external-id": "nsx-vlan-transportzone-611", "segmentation_id": 611, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3d4a9268-2d", "ovs_interfaceid": "3d4a9268-2d6e-49e4-a0ba-464aefeef366", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 
862.228882] env[60548]: INFO nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Terminating instance [ 862.232286] env[60548]: DEBUG nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 862.232521] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 862.232796] env[60548]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a4999796-8a6a-4598-b06f-4c9a55c5fc5b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 862.243656] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67f0f148-50ff-4b27-a97c-bec25f236dfc {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 862.278698] env[60548]: WARNING nova.virt.vmwareapi.vmops [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 67cbaf5c-e743-4e07-8f74-c51e4f57914d could not be found. [ 862.278698] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 862.278698] env[60548]: INFO nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Took 0.05 seconds to destroy the instance on the hypervisor. 
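As with 9a14b9d0 above, SearchIndex.FindAllByUuid finds no backend VM for 67cbaf5c, so vmops logs the InstanceNotFound warning and proceeds as if the destroy succeeded. The guard is essentially the following (the exception class and helper are local stand-ins, not the exact vmops code):

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def _find_vm_by_uuid(instance_uuid):
        # Stand-in for the SearchIndex.FindAllByUuid lookup; here the VM
        # is already gone, as in the log.
        raise InstanceNotFound(instance_uuid)

    def destroy(instance_uuid):
        try:
            vm_ref = _find_vm_by_uuid(instance_uuid)
            # ... power off and unregister vm_ref here ...
        except InstanceNotFound:
            # "Instance does not exist on backend": warn and fall through so
            # the delete path can still release networks, claims and quota.
            pass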
[ 862.279868] env[60548]: DEBUG nova.compute.claims [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 862.279868] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 862.280029] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 862.321372] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.041s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 862.322239] env[60548]: DEBUG nova.compute.utils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Instance 67cbaf5c-e743-4e07-8f74-c51e4f57914d could not be found. {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 862.323828] env[60548]: DEBUG nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Instance disappeared during build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 862.324072] env[60548]: DEBUG nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 862.324961] env[60548]: DEBUG nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 862.324961] env[60548]: DEBUG nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 862.324961] env[60548]: DEBUG nova.network.neutron [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 863.201230] env[60548]: DEBUG nova.compute.manager [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Received event network-vif-plugged-3d4a9268-2d6e-49e4-a0ba-464aefeef366 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 863.201230] env[60548]: DEBUG oslo_concurrency.lockutils [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] Acquiring lock "67cbaf5c-e743-4e07-8f74-c51e4f57914d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.201230] env[60548]: DEBUG oslo_concurrency.lockutils [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] Lock "67cbaf5c-e743-4e07-8f74-c51e4f57914d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.201230] env[60548]: DEBUG oslo_concurrency.lockutils [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] Lock "67cbaf5c-e743-4e07-8f74-c51e4f57914d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 863.201230] env[60548]: DEBUG nova.compute.manager [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] No waiting events found dispatching network-vif-plugged-3d4a9268-2d6e-49e4-a0ba-464aefeef366 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 863.201230] env[60548]: WARNING nova.compute.manager [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Received unexpected event network-vif-plugged-3d4a9268-2d6e-49e4-a0ba-464aefeef366 for instance with vm_state deleted and task_state None. 
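The "Received unexpected event" warning above is the benign tail of the race: Neutron's network-vif-plugged notification lands after the instance was deleted, so pop_instance_event finds no registered waiter under the "<uuid>-events" lock. The mechanism is essentially a name-keyed map of threading events; a self-contained sketch of the idea (not Nova's actual InstanceEvents class):

    import threading
    from collections import defaultdict

    _events = defaultdict(dict)      # instance_uuid -> {event_name: Event}
    _events_lock = threading.Lock()  # plays the role of the "<uuid>-events" lock

    def prepare_event(instance_uuid, name):
        # Called before the operation that expects the event.
        with _events_lock:
            return _events[instance_uuid].setdefault(name, threading.Event())

    def pop_instance_event(instance_uuid, name):
        # Called by the external-event RPC when Neutron reports the VIF.
        with _events_lock:
            waiter = _events.get(instance_uuid, {}).pop(name, None)
        if waiter is None:
            return False  # the "No waiting events found dispatching ..." path
        waiter.set()      # wakes whoever is blocked on prepare_event(...).wait()
        return True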
[ 863.201230] env[60548]: DEBUG nova.compute.manager [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Received event network-changed-3d4a9268-2d6e-49e4-a0ba-464aefeef366 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 863.201230] env[60548]: DEBUG nova.compute.manager [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Refreshing instance network info cache due to event network-changed-3d4a9268-2d6e-49e4-a0ba-464aefeef366. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 863.203114] env[60548]: DEBUG oslo_concurrency.lockutils [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] Acquiring lock "refresh_cache-67cbaf5c-e743-4e07-8f74-c51e4f57914d" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 863.203418] env[60548]: DEBUG oslo_concurrency.lockutils [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] Acquired lock "refresh_cache-67cbaf5c-e743-4e07-8f74-c51e4f57914d" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 863.203524] env[60548]: DEBUG nova.network.neutron [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Refreshing network info cache for port 3d4a9268-2d6e-49e4-a0ba-464aefeef366 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 863.223928] env[60548]: DEBUG nova.network.neutron [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 863.240464] env[60548]: INFO nova.compute.manager [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Took 0.92 seconds to deallocate network for instance. [ 863.257156] env[60548]: DEBUG nova.network.neutron [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 863.287234] env[60548]: DEBUG oslo_concurrency.lockutils [None req-348fc44a-8435-47bd-86c7-d1c4bfe70069 tempest-ServerGroupTestJSON-1519550876 tempest-ServerGroupTestJSON-1519550876-project-member] Lock "67cbaf5c-e743-4e07-8f74-c51e4f57914d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.207s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 863.297982] env[60548]: DEBUG nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Starting instance... 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 863.357781] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.362020] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.362020] env[60548]: INFO nova.compute.claims [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 863.598012] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2e025b7-76e9-4f09-a7c4-ee7c5d256e8a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.607028] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90b1cb20-0d76-498e-a134-2d943eda3e03 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.639061] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db381ff4-623c-455c-b362-924a928f914e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.647857] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be87369f-0756-4b4b-b7ef-69f48bc2e344 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.666619] env[60548]: DEBUG nova.compute.provider_tree [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 863.672381] env[60548]: DEBUG nova.scheduler.client.report [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 863.695807] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c 
tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.335s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 863.695807] env[60548]: DEBUG nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 863.730576] env[60548]: DEBUG nova.compute.utils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 863.735722] env[60548]: DEBUG nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 863.735909] env[60548]: DEBUG nova.network.neutron [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 863.750567] env[60548]: DEBUG nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 863.822836] env[60548]: DEBUG nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Start spawning the instance on the hypervisor. 
{{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 863.847705] env[60548]: DEBUG nova.virt.hardware [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 863.848115] env[60548]: DEBUG nova.virt.hardware [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 863.848115] env[60548]: DEBUG nova.virt.hardware [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 863.848300] env[60548]: DEBUG nova.virt.hardware [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 863.848415] env[60548]: DEBUG nova.virt.hardware [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 863.848861] env[60548]: DEBUG nova.virt.hardware [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 863.848861] env[60548]: DEBUG nova.virt.hardware [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 863.848978] env[60548]: DEBUG nova.virt.hardware [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 863.851906] env[60548]: DEBUG nova.virt.hardware [None 
req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 863.851906] env[60548]: DEBUG nova.virt.hardware [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 863.851906] env[60548]: DEBUG nova.virt.hardware [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 863.851906] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c2ec73b-7465-4c26-a100-93daa55822ba {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.853751] env[60548]: DEBUG nova.network.neutron [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Instance is deleted, no further info cache update {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:106}} [ 863.853889] env[60548]: DEBUG oslo_concurrency.lockutils [req-4c868ff7-24cf-4573-b3af-712cb09bec38 req-9558252a-8533-462d-b02d-b729128daaf4 service nova] Releasing lock "refresh_cache-67cbaf5c-e743-4e07-8f74-c51e4f57914d" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 863.860040] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46a15267-336e-4bbc-a3b1-2f1db230fec8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.914052] env[60548]: DEBUG nova.policy [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'daf8a085e5194bfc92a5d185789a92da', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a43bd29c3e8a4be6bfe9cbc12cf4c9a8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 865.001087] env[60548]: DEBUG nova.network.neutron [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Successfully created port: e1111eaf-55c7-4a09-b3ce-5ce771553fee {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 865.914067] env[60548]: DEBUG nova.network.neutron [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Successfully 
updated port: e1111eaf-55c7-4a09-b3ce-5ce771553fee {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 865.925152] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Acquiring lock "refresh_cache-6774e2f5-99d0-4dc9-9ac0-188b35bd68a2" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 865.925152] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Acquired lock "refresh_cache-6774e2f5-99d0-4dc9-9ac0-188b35bd68a2" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 865.925152] env[60548]: DEBUG nova.network.neutron [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 865.929181] env[60548]: DEBUG nova.compute.manager [req-b923616e-8b29-43dd-a066-14d2b54b8ddc req-2b43e7aa-f883-4c17-a091-7dd15270006d service nova] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Received event network-vif-deleted-3d4a9268-2d6e-49e4-a0ba-464aefeef366 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 866.019808] env[60548]: DEBUG nova.network.neutron [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Instance cache missing network info. 
{{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 866.363367] env[60548]: DEBUG nova.network.neutron [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Updating instance_info_cache with network_info: [{"id": "e1111eaf-55c7-4a09-b3ce-5ce771553fee", "address": "fa:16:3e:dd:db:05", "network": {"id": "77db9dc2-3fc7-4e7e-a079-39522a186454", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-831457836-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a43bd29c3e8a4be6bfe9cbc12cf4c9a8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5514c5a3-1294-40ad-ae96-29d5c24a3d95", "external-id": "nsx-vlan-transportzone-179", "segmentation_id": 179, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape1111eaf-55", "ovs_interfaceid": "e1111eaf-55c7-4a09-b3ce-5ce771553fee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 866.377590] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Releasing lock "refresh_cache-6774e2f5-99d0-4dc9-9ac0-188b35bd68a2" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 866.377899] env[60548]: DEBUG nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Instance network_info: |[{"id": "e1111eaf-55c7-4a09-b3ce-5ce771553fee", "address": "fa:16:3e:dd:db:05", "network": {"id": "77db9dc2-3fc7-4e7e-a079-39522a186454", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-831457836-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a43bd29c3e8a4be6bfe9cbc12cf4c9a8", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5514c5a3-1294-40ad-ae96-29d5c24a3d95", "external-id": "nsx-vlan-transportzone-179", "segmentation_id": 179, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape1111eaf-55", "ovs_interfaceid": "e1111eaf-55c7-4a09-b3ce-5ce771553fee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 866.378322] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:dd:db:05', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5514c5a3-1294-40ad-ae96-29d5c24a3d95', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e1111eaf-55c7-4a09-b3ce-5ce771553fee', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 866.388136] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Creating folder: Project (a43bd29c3e8a4be6bfe9cbc12cf4c9a8). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 866.388706] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-288f4bc0-18ca-4477-8d14-bf22f39f0826 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.400886] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Created folder: Project (a43bd29c3e8a4be6bfe9cbc12cf4c9a8) in parent group-v850287. [ 866.401109] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Creating folder: Instances. Parent ref: group-v850342. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 866.401360] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-09c86ab1-8dd0-46f0-afa7-96fdc3acf56f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.411970] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Created folder: Instances in parent group-v850342. [ 866.412242] env[60548]: DEBUG oslo.service.loopingcall [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 866.412431] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 866.413022] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ce3ed1d8-c5a8-4536-a6d5-a23fdc6ae2ea {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.437853] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 866.437853] env[60548]: value = "task-4323366" [ 866.437853] env[60548]: _type = "Task" [ 866.437853] env[60548]: } to complete. 
{{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 866.452126] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323366, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 866.951437] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323366, 'name': CreateVM_Task, 'duration_secs': 0.313633} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 866.951877] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 866.952389] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 866.952552] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 866.952865] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 866.953128] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ea4cf6b6-51c8-4e6b-a5bd-07fbcb650d12 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.959987] env[60548]: DEBUG oslo_vmware.api [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Waiting for the task: (returnval){ [ 866.959987] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]529c1bcd-06db-bb4e-3669-415493bb23a4" [ 866.959987] env[60548]: _type = "Task" [ 866.959987] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 866.969110] env[60548]: DEBUG oslo_vmware.api [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]529c1bcd-06db-bb4e-3669-415493bb23a4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 867.171940] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 867.475887] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 867.475887] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 867.476149] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 868.174924] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 868.397327] env[60548]: DEBUG nova.compute.manager [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Received event network-vif-plugged-e1111eaf-55c7-4a09-b3ce-5ce771553fee {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 868.397427] env[60548]: DEBUG oslo_concurrency.lockutils [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] Acquiring lock "6774e2f5-99d0-4dc9-9ac0-188b35bd68a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.398604] env[60548]: DEBUG oslo_concurrency.lockutils [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] Lock "6774e2f5-99d0-4dc9-9ac0-188b35bd68a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.398604] env[60548]: DEBUG oslo_concurrency.lockutils [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] Lock "6774e2f5-99d0-4dc9-9ac0-188b35bd68a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 868.398604] env[60548]: DEBUG nova.compute.manager [req-48de6ad3-63af-47b1-af38-8f7aec75da34 
req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] No waiting events found dispatching network-vif-plugged-e1111eaf-55c7-4a09-b3ce-5ce771553fee {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 868.402018] env[60548]: WARNING nova.compute.manager [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Received unexpected event network-vif-plugged-e1111eaf-55c7-4a09-b3ce-5ce771553fee for instance with vm_state deleted and task_state None. [ 868.402018] env[60548]: DEBUG nova.compute.manager [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Received event network-changed-e1111eaf-55c7-4a09-b3ce-5ce771553fee {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 868.402018] env[60548]: DEBUG nova.compute.manager [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Refreshing instance network info cache due to event network-changed-e1111eaf-55c7-4a09-b3ce-5ce771553fee. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 868.402018] env[60548]: DEBUG oslo_concurrency.lockutils [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] Acquiring lock "refresh_cache-6774e2f5-99d0-4dc9-9ac0-188b35bd68a2" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 868.402018] env[60548]: DEBUG oslo_concurrency.lockutils [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] Acquired lock "refresh_cache-6774e2f5-99d0-4dc9-9ac0-188b35bd68a2" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 868.402018] env[60548]: DEBUG nova.network.neutron [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Refreshing network info cache for port e1111eaf-55c7-4a09-b3ce-5ce771553fee {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 868.447157] env[60548]: DEBUG nova.network.neutron [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Instance cache missing network info. 
{{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 868.805246] env[60548]: DEBUG nova.network.neutron [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Instance is deleted, no further info cache update {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:106}} [ 868.805546] env[60548]: DEBUG oslo_concurrency.lockutils [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] Releasing lock "refresh_cache-6774e2f5-99d0-4dc9-9ac0-188b35bd68a2" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 868.805724] env[60548]: DEBUG nova.compute.manager [req-48de6ad3-63af-47b1-af38-8f7aec75da34 req-4b5941c1-b811-43e8-a5ee-0e3e498a119d service nova] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Received event network-vif-deleted-e1111eaf-55c7-4a09-b3ce-5ce771553fee {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 870.171560] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 870.171813] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 870.171980] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 870.172140] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60548) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 871.167464] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 871.196796] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 871.210832] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 871.210832] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 871.211091] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 871.211172] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60548) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 871.212648] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39b93f36-1a61-4d6d-8915-0021144bc62d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.225921] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1010ed6c-6484-4ada-a6ea-46ecda778ae5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.247437] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ba38297-d863-41cf-ab40-20a7aa227277 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.256914] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb62bfeb-5d93-44af-8d37-12b434e687b4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.293857] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180681MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60548) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 871.294109] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 871.294795] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 871.356547] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b14d8d3f-9253-4e5f-a5b8-09a2f43888a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 871.356714] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 386edc81-5f27-4e44-af7a-f5e47ded1327 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 871.372999] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance c1b6b578-1bed-4e6c-8e7a-34c2e469cd80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.392280] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance fd3e6440-74fc-4425-9b7e-571245ddc379 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.408108] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance fbf4b0bc-d87a-4063-87ef-e44f72ecbf5c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.422052] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 81c71aa0-9c68-407a-9bef-708c2cb70b12 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.436504] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance bdae41ee-e9c3-4272-9623-ca88464ec45a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.449672] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3a0515f1-e61a-48d4-980d-49c7189dca2d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.464274] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 30cf201d-7a1c-479c-9040-fba38726d9ab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 871.464274] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 871.464274] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=100GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 871.643033] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcb7463d-b7ad-498e-8fcb-8b00428474cf {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.650850] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c4c2e78-1206-4cd3-b4f5-cc458efdc849 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.686791] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f21055a3-f64c-46ea-9b85-f4a73889537d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.695696] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4769bf36-99d6-4ccd-a50e-816e02370e9f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.712236] env[60548]: DEBUG nova.compute.provider_tree [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) 
update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 871.722457] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 871.750121] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60548) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 871.750353] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.456s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 872.751110] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 873.172856] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 873.172856] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Starting heal instance info cache {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 873.172856] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Rebuilding the list of instances to heal {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 873.190063] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 873.190063] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 873.190063] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Didn't find any instances for network info cache update. 
{{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 873.190063] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 875.933041] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Acquiring lock "ad98988d-92aa-4ace-8e40-cd316758002e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 875.933309] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Lock "ad98988d-92aa-4ace-8e40-cd316758002e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 878.372234] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Acquiring lock "585e3015-faef-40df-b3dd-04d2c8e4dd00" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.372538] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Lock "585e3015-faef-40df-b3dd-04d2c8e4dd00" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 881.458043] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquiring lock "e3fd811a-186d-436f-bdef-a910a3ccd416" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 881.458043] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Lock "e3fd811a-186d-436f-bdef-a910a3ccd416" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 881.458043] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Acquiring lock "979c5fe5-051f-4a43-be2f-571aad25a4ae" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 881.458043] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Lock "979c5fe5-051f-4a43-be2f-571aad25a4ae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 881.489960] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquiring lock "8f447658-c66d-4d94-af30-fd43c83dae0e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 881.490204] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Lock "8f447658-c66d-4d94-af30-fd43c83dae0e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 882.060660] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Acquiring lock "e6466fbb-a225-4bbd-839b-f8c4b24d9860" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 882.060660] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Lock "e6466fbb-a225-4bbd-839b-f8c4b24d9860" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 884.065787] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Acquiring lock "3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 884.066135] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Lock "3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 884.413658] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Acquiring lock 
"d0d515a4-15ce-4276-b151-34a8a556a1df" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 884.413902] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Lock "d0d515a4-15ce-4276-b151-34a8a556a1df" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 906.499623] env[60548]: WARNING oslo_vmware.rw_handles [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 906.499623] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 906.499623] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 906.499623] env[60548]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 906.499623] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 906.499623] env[60548]: ERROR oslo_vmware.rw_handles response.begin() [ 906.499623] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 906.499623] env[60548]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 906.499623] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 906.499623] env[60548]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 906.499623] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 906.499623] env[60548]: ERROR oslo_vmware.rw_handles [ 906.500222] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to vmware_temp/6930019e-1fbc-46a1-829c-3c0410ee5da2/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 906.501867] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Caching image {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 906.502135] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Copying Virtual Disk [datastore1] vmware_temp/6930019e-1fbc-46a1-829c-3c0410ee5da2/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk to [datastore1] 
vmware_temp/6930019e-1fbc-46a1-829c-3c0410ee5da2/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk {{(pid=60548) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 906.502444] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-061313df-3f62-486f-a2c1-6d325d244f4d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.511600] env[60548]: DEBUG oslo_vmware.api [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Waiting for the task: (returnval){ [ 906.511600] env[60548]: value = "task-4323367" [ 906.511600] env[60548]: _type = "Task" [ 906.511600] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 906.519849] env[60548]: DEBUG oslo_vmware.api [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Task: {'id': task-4323367, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 907.022443] env[60548]: DEBUG oslo_vmware.exceptions [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Fault InvalidArgument not matched. {{(pid=60548) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 907.022709] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 907.023306] env[60548]: ERROR nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 907.023306] env[60548]: Faults: ['InvalidArgument'] [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Traceback (most recent call last): [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] yield resources [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] self.driver.spawn(context, instance, image_meta, [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] self._vmops.spawn(context, instance, 
image_meta, injected_files, [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] self._fetch_image_if_missing(context, vi) [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] image_cache(vi, tmp_image_ds_loc) [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] vm_util.copy_virtual_disk( [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] session._wait_for_task(vmdk_copy_task) [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] return self.wait_for_task(task_ref) [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] return evt.wait() [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] result = hub.switch() [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] return self.greenlet.switch() [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] self.f(*self.args, **self.kw) [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] raise exceptions.translate_fault(task_info.error) [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 
386edc81-5f27-4e44-af7a-f5e47ded1327] Faults: ['InvalidArgument'] [ 907.023306] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] [ 907.024101] env[60548]: INFO nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Terminating instance [ 907.025255] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 907.025480] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 907.025716] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-78655b30-a9fa-41b3-8250-9ccdbff4fc65 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.028019] env[60548]: DEBUG nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 907.028217] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 907.028960] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c87f4a0e-105b-4ce1-9b87-c9fa32e5f18e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.036178] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 907.036400] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a7717366-685e-4228-bc12-4d1692d53a0c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.038640] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 907.038811] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 
tempest-ServersTestMultiNic-1533444800-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 907.039778] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fd044514-de47-49cf-b584-e4424d47991b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.045258] env[60548]: DEBUG oslo_vmware.api [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Waiting for the task: (returnval){ [ 907.045258] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52cfe749-d8d8-2293-04aa-2720311e795e" [ 907.045258] env[60548]: _type = "Task" [ 907.045258] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 907.056565] env[60548]: DEBUG oslo_vmware.api [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52cfe749-d8d8-2293-04aa-2720311e795e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 907.519188] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 907.519537] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 907.519743] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Deleting the datastore file [datastore1] 386edc81-5f27-4e44-af7a-f5e47ded1327 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 907.519961] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2826b0a1-dffb-42b9-8dc9-adea96237c3c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.526897] env[60548]: DEBUG oslo_vmware.api [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Waiting for the task: (returnval){ [ 907.526897] env[60548]: value = "task-4323369" [ 907.526897] env[60548]: _type = "Task" [ 907.526897] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 907.535286] env[60548]: DEBUG oslo_vmware.api [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Task: {'id': task-4323369, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 907.554012] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 907.554267] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Creating directory with path [datastore1] vmware_temp/7d8ba19d-590c-4566-aa28-d568fb28159c/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 907.554503] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cf8b53a0-02d0-4ea4-aa25-599d02f3f356 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.574543] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Created directory with path [datastore1] vmware_temp/7d8ba19d-590c-4566-aa28-d568fb28159c/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 907.574750] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Fetch image to [datastore1] vmware_temp/7d8ba19d-590c-4566-aa28-d568fb28159c/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 907.574931] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/7d8ba19d-590c-4566-aa28-d568fb28159c/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 907.575764] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d93224e-7de2-4c2b-bc69-c5c6a06073bb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.582761] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c754ccb-7ec0-4219-b112-12bb6b757f66 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.592086] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab0b215d-98e9-437d-a851-88f50b69e3f2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.624428] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c66ba76c-62c1-4e8a-8f5c-2199766c3437 {{(pid=60548) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.631365] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ec20644a-c030-459a-a188-516343fe40d1 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.651683] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 907.698629] env[60548]: DEBUG oslo_vmware.rw_handles [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7d8ba19d-590c-4566-aa28-d568fb28159c/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 907.756699] env[60548]: DEBUG oslo_vmware.rw_handles [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Completed reading data from the image iterator. {{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 907.756856] env[60548]: DEBUG oslo_vmware.rw_handles [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7d8ba19d-590c-4566-aa28-d568fb28159c/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 908.036520] env[60548]: DEBUG oslo_vmware.api [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Task: {'id': task-4323369, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.095028} completed successfully. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 908.036761] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 908.036943] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 908.037127] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 908.037340] env[60548]: INFO nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Took 1.01 seconds to destroy the instance on the hypervisor. [ 908.039533] env[60548]: DEBUG nova.compute.claims [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 908.039772] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 908.040060] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.216195] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eec713e-628e-4ebe-ab6e-fddbca7c7bbb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.224109] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b7d8a2c-d6c8-4692-ba8b-3295a7a83522 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.253599] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77221e80-23bf-40bc-a6c0-032c12432636 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.261502] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d99af56-be2b-487a-b895-4ff7e7915520 {{(pid=60548) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.274916] env[60548]: DEBUG nova.compute.provider_tree [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 908.283943] env[60548]: DEBUG nova.scheduler.client.report [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 908.297350] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.257s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.297914] env[60548]: ERROR nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 908.297914] env[60548]: Faults: ['InvalidArgument'] [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Traceback (most recent call last): [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] self.driver.spawn(context, instance, image_meta, [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] self._vmops.spawn(context, instance, image_meta, injected_files, [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] self._fetch_image_if_missing(context, vi) [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] image_cache(vi, tmp_image_ds_loc) [ 908.297914] env[60548]: ERROR 
nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] vm_util.copy_virtual_disk( [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] session._wait_for_task(vmdk_copy_task) [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] return self.wait_for_task(task_ref) [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] return evt.wait() [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] result = hub.switch() [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] return self.greenlet.switch() [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] self.f(*self.args, **self.kw) [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] raise exceptions.translate_fault(task_info.error) [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Faults: ['InvalidArgument'] [ 908.297914] env[60548]: ERROR nova.compute.manager [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] [ 908.299433] env[60548]: DEBUG nova.compute.utils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] VimFaultException {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 908.300152] env[60548]: DEBUG nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 
386edc81-5f27-4e44-af7a-f5e47ded1327] Build of instance 386edc81-5f27-4e44-af7a-f5e47ded1327 was re-scheduled: A specified parameter was not correct: fileType [ 908.300152] env[60548]: Faults: ['InvalidArgument'] {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 908.300519] env[60548]: DEBUG nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 908.300687] env[60548]: DEBUG nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 908.300837] env[60548]: DEBUG nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 908.300995] env[60548]: DEBUG nova.network.neutron [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 908.576218] env[60548]: DEBUG nova.network.neutron [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 908.591101] env[60548]: INFO nova.compute.manager [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Took 0.29 seconds to deallocate network for instance. 
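The InvalidArgument: fileType fault recorded above is vCenter rejecting the CopyVirtualDisk_Task that nova.virt.vmwareapi.vm_util.copy_virtual_disk issues while caching the sparse image, as the traceback shows. A minimal sketch of that same call path driven directly through oslo.vmware, outside Nova; the endpoint, credentials and VMDK paths are placeholders, not values from this log:

    # Sketch only: reproduce the CopyVirtualDisk_Task -> VimFaultException path
    # from the traceback above. Host, credentials and datastore paths are
    # assumptions for illustration.
    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vexc

    session = vmware_api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',      # assumed vCenter endpoint/creds
        api_retry_count=3, task_poll_interval=0.5)

    disk_mgr = session.vim.service_content.virtualDiskManager
    # Against vCenter you would normally also pass sourceDatacenter/destDatacenter
    # managed-object refs; they are omitted here to keep the sketch short.
    task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore1] devstack-image-cache_base/tmp-src.vmdk',
        destName='[datastore1] devstack-image-cache_base/dst.vmdk')
    try:
        # wait_for_task polls the task and raises VimFaultException on failure,
        # which is what session._wait_for_task propagated in the traceback.
        session.wait_for_task(task)
    except vexc.VimFaultException as exc:
        print(exc.fault_list, exc)   # e.g. ['InvalidArgument'], "A specified parameter was not correct: fileType"

On this failure Nova aborts the resource claim, deallocates networking and hands the instance back to the scheduler, which is exactly the re-schedule sequence logged above.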
[ 908.680358] env[60548]: INFO nova.scheduler.client.report [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Deleted allocations for instance 386edc81-5f27-4e44-af7a-f5e47ded1327 [ 908.698595] env[60548]: DEBUG oslo_concurrency.lockutils [None req-63ddf261-dc6a-42d1-841c-f112b680c721 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Lock "386edc81-5f27-4e44-af7a-f5e47ded1327" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 334.022s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.699878] env[60548]: DEBUG oslo_concurrency.lockutils [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Lock "386edc81-5f27-4e44-af7a-f5e47ded1327" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 135.926s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.700129] env[60548]: DEBUG oslo_concurrency.lockutils [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Acquiring lock "386edc81-5f27-4e44-af7a-f5e47ded1327-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 908.700334] env[60548]: DEBUG oslo_concurrency.lockutils [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Lock "386edc81-5f27-4e44-af7a-f5e47ded1327-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.700502] env[60548]: DEBUG oslo_concurrency.lockutils [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Lock "386edc81-5f27-4e44-af7a-f5e47ded1327-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.702323] env[60548]: INFO nova.compute.manager [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Terminating instance [ 908.704376] env[60548]: DEBUG nova.compute.manager [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Start destroying the instance on the hypervisor. 
{{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 908.704565] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 908.705316] env[60548]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0c577461-148e-46ac-b4f9-eba9b88cda65 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.715022] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97dff30f-3bee-4087-b1a0-117b2b5aad5e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.727728] env[60548]: DEBUG nova.compute.manager [None req-adf07f4a-3bca-41f2-9a5f-debef055977c tempest-ServersNegativeTestMultiTenantJSON-819000951 tempest-ServersNegativeTestMultiTenantJSON-819000951-project-member] [instance: a878722d-7e36-4f15-8c5f-bd473375dd9b] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 908.749600] env[60548]: WARNING nova.virt.vmwareapi.vmops [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 386edc81-5f27-4e44-af7a-f5e47ded1327 could not be found. [ 908.749600] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 908.749600] env[60548]: INFO nova.compute.manager [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Took 0.04 seconds to destroy the instance on the hypervisor. [ 908.749790] env[60548]: DEBUG oslo.service.loopingcall [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 908.749935] env[60548]: DEBUG nova.compute.manager [-] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 908.750041] env[60548]: DEBUG nova.network.neutron [-] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 908.752574] env[60548]: DEBUG nova.compute.manager [None req-adf07f4a-3bca-41f2-9a5f-debef055977c tempest-ServersNegativeTestMultiTenantJSON-819000951 tempest-ServersNegativeTestMultiTenantJSON-819000951-project-member] [instance: a878722d-7e36-4f15-8c5f-bd473375dd9b] Instance disappeared before build. 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 908.775283] env[60548]: DEBUG oslo_concurrency.lockutils [None req-adf07f4a-3bca-41f2-9a5f-debef055977c tempest-ServersNegativeTestMultiTenantJSON-819000951 tempest-ServersNegativeTestMultiTenantJSON-819000951-project-member] Lock "a878722d-7e36-4f15-8c5f-bd473375dd9b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 242.832s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.782402] env[60548]: DEBUG nova.network.neutron [-] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 908.783779] env[60548]: DEBUG nova.compute.manager [None req-e5837970-1bd9-42cd-9028-3a8d80878924 tempest-ServerMetadataNegativeTestJSON-2122666783 tempest-ServerMetadataNegativeTestJSON-2122666783-project-member] [instance: deffd52b-d708-4c46-a168-18e80b05b133] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 908.790663] env[60548]: INFO nova.compute.manager [-] [instance: 386edc81-5f27-4e44-af7a-f5e47ded1327] Took 0.04 seconds to deallocate network for instance. [ 908.810597] env[60548]: DEBUG nova.compute.manager [None req-e5837970-1bd9-42cd-9028-3a8d80878924 tempest-ServerMetadataNegativeTestJSON-2122666783 tempest-ServerMetadataNegativeTestJSON-2122666783-project-member] [instance: deffd52b-d708-4c46-a168-18e80b05b133] Instance disappeared before build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 908.835218] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e5837970-1bd9-42cd-9028-3a8d80878924 tempest-ServerMetadataNegativeTestJSON-2122666783 tempest-ServerMetadataNegativeTestJSON-2122666783-project-member] Lock "deffd52b-d708-4c46-a168-18e80b05b133" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 242.103s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.848304] env[60548]: DEBUG nova.compute.manager [None req-34948935-a815-4d21-8ed3-ddfac054bc11 tempest-ServerTagsTestJSON-1582888721 tempest-ServerTagsTestJSON-1582888721-project-member] [instance: bf1694f2-6ad0-4e15-b05d-c73c24e0e955] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 908.869924] env[60548]: DEBUG nova.compute.manager [None req-34948935-a815-4d21-8ed3-ddfac054bc11 tempest-ServerTagsTestJSON-1582888721 tempest-ServerTagsTestJSON-1582888721-project-member] [instance: bf1694f2-6ad0-4e15-b05d-c73c24e0e955] Instance disappeared before build. 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 908.893488] env[60548]: DEBUG oslo_concurrency.lockutils [None req-271f3c6a-ba8d-498c-9457-307bb55b52e0 tempest-TenantUsagesTestJSON-1233530310 tempest-TenantUsagesTestJSON-1233530310-project-member] Lock "386edc81-5f27-4e44-af7a-f5e47ded1327" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.194s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.901475] env[60548]: DEBUG oslo_concurrency.lockutils [None req-34948935-a815-4d21-8ed3-ddfac054bc11 tempest-ServerTagsTestJSON-1582888721 tempest-ServerTagsTestJSON-1582888721-project-member] Lock "bf1694f2-6ad0-4e15-b05d-c73c24e0e955" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 238.866s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.922027] env[60548]: DEBUG nova.compute.manager [None req-7c9a00f6-9343-4899-83da-79ed712f6304 tempest-ServerActionsTestOtherB-1079721433 tempest-ServerActionsTestOtherB-1079721433-project-member] [instance: ecc4262d-6133-4541-aec8-fbee05180701] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 908.951629] env[60548]: DEBUG nova.compute.manager [None req-7c9a00f6-9343-4899-83da-79ed712f6304 tempest-ServerActionsTestOtherB-1079721433 tempest-ServerActionsTestOtherB-1079721433-project-member] [instance: ecc4262d-6133-4541-aec8-fbee05180701] Instance disappeared before build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 908.980233] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7c9a00f6-9343-4899-83da-79ed712f6304 tempest-ServerActionsTestOtherB-1079721433 tempest-ServerActionsTestOtherB-1079721433-project-member] Lock "ecc4262d-6133-4541-aec8-fbee05180701" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 238.597s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.994768] env[60548]: DEBUG nova.compute.manager [None req-eb5badd8-d0c4-4998-b0c5-30e887803cef tempest-ServerActionsV293TestJSON-390462293 tempest-ServerActionsV293TestJSON-390462293-project-member] [instance: c1b6b578-1bed-4e6c-8e7a-34c2e469cd80] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 909.021148] env[60548]: DEBUG nova.compute.manager [None req-eb5badd8-d0c4-4998-b0c5-30e887803cef tempest-ServerActionsV293TestJSON-390462293 tempest-ServerActionsV293TestJSON-390462293-project-member] [instance: c1b6b578-1bed-4e6c-8e7a-34c2e469cd80] Instance disappeared before build. 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 909.042219] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb5badd8-d0c4-4998-b0c5-30e887803cef tempest-ServerActionsV293TestJSON-390462293 tempest-ServerActionsV293TestJSON-390462293-project-member] Lock "c1b6b578-1bed-4e6c-8e7a-34c2e469cd80" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 230.697s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.051893] env[60548]: DEBUG nova.compute.manager [None req-87481eb0-9c3c-4944-8961-f008b897cc07 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: fd3e6440-74fc-4425-9b7e-571245ddc379] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 909.073851] env[60548]: DEBUG nova.compute.manager [None req-87481eb0-9c3c-4944-8961-f008b897cc07 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: fd3e6440-74fc-4425-9b7e-571245ddc379] Instance disappeared before build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 909.094525] env[60548]: DEBUG oslo_concurrency.lockutils [None req-87481eb0-9c3c-4944-8961-f008b897cc07 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Lock "fd3e6440-74fc-4425-9b7e-571245ddc379" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 228.554s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.103493] env[60548]: DEBUG nova.compute.manager [None req-dba59053-e9d1-4c49-bf41-31c2f306dde2 tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: fbf4b0bc-d87a-4063-87ef-e44f72ecbf5c] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 909.126897] env[60548]: DEBUG nova.compute.manager [None req-dba59053-e9d1-4c49-bf41-31c2f306dde2 tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: fbf4b0bc-d87a-4063-87ef-e44f72ecbf5c] Instance disappeared before build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 909.149884] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dba59053-e9d1-4c49-bf41-31c2f306dde2 tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Lock "fbf4b0bc-d87a-4063-87ef-e44f72ecbf5c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 227.728s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.159349] env[60548]: DEBUG nova.compute.manager [None req-dba59053-e9d1-4c49-bf41-31c2f306dde2 tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 81c71aa0-9c68-407a-9bef-708c2cb70b12] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 909.184590] env[60548]: DEBUG nova.compute.manager [None req-dba59053-e9d1-4c49-bf41-31c2f306dde2 tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 81c71aa0-9c68-407a-9bef-708c2cb70b12] Instance disappeared before build. 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 909.206811] env[60548]: DEBUG oslo_concurrency.lockutils [None req-dba59053-e9d1-4c49-bf41-31c2f306dde2 tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Lock "81c71aa0-9c68-407a-9bef-708c2cb70b12" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 227.747s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.218102] env[60548]: DEBUG nova.compute.manager [None req-46fa9596-66bc-4024-b931-31353f9cf956 tempest-ServerRescueNegativeTestJSON-12688595 tempest-ServerRescueNegativeTestJSON-12688595-project-member] [instance: bdae41ee-e9c3-4272-9623-ca88464ec45a] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 909.241270] env[60548]: DEBUG nova.compute.manager [None req-46fa9596-66bc-4024-b931-31353f9cf956 tempest-ServerRescueNegativeTestJSON-12688595 tempest-ServerRescueNegativeTestJSON-12688595-project-member] [instance: bdae41ee-e9c3-4272-9623-ca88464ec45a] Instance disappeared before build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 909.262067] env[60548]: DEBUG oslo_concurrency.lockutils [None req-46fa9596-66bc-4024-b931-31353f9cf956 tempest-ServerRescueNegativeTestJSON-12688595 tempest-ServerRescueNegativeTestJSON-12688595-project-member] Lock "bdae41ee-e9c3-4272-9623-ca88464ec45a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 218.767s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.271719] env[60548]: DEBUG nova.compute.manager [None req-cf73e27a-dcd0-427f-b470-9f587fb5df7d tempest-ServerRescueNegativeTestJSON-12688595 tempest-ServerRescueNegativeTestJSON-12688595-project-member] [instance: 3a0515f1-e61a-48d4-980d-49c7189dca2d] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 909.295617] env[60548]: DEBUG nova.compute.manager [None req-cf73e27a-dcd0-427f-b470-9f587fb5df7d tempest-ServerRescueNegativeTestJSON-12688595 tempest-ServerRescueNegativeTestJSON-12688595-project-member] [instance: 3a0515f1-e61a-48d4-980d-49c7189dca2d] Instance disappeared before build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 909.318772] env[60548]: DEBUG oslo_concurrency.lockutils [None req-cf73e27a-dcd0-427f-b470-9f587fb5df7d tempest-ServerRescueNegativeTestJSON-12688595 tempest-ServerRescueNegativeTestJSON-12688595-project-member] Lock "3a0515f1-e61a-48d4-980d-49c7189dca2d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 217.928s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.330906] env[60548]: DEBUG nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Starting instance... 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 909.381147] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 909.381409] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 909.382874] env[60548]: INFO nova.compute.claims [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 909.556119] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec694d79-eea3-4a99-a34e-bf5a46a1c803 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.565538] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a90b99fe-d95d-4065-8bbb-490317b66ac0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.595452] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8325624-de51-4e71-af2b-f520c56b8ddb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.603461] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f14b21b-88ad-4727-ac5d-800e09f72cbc {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.618229] env[60548]: DEBUG nova.compute.provider_tree [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 909.626850] env[60548]: DEBUG nova.scheduler.client.report [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 909.640122] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c 
tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 909.640659] env[60548]: DEBUG nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 909.671859] env[60548]: DEBUG nova.compute.utils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 909.673017] env[60548]: DEBUG nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 909.673191] env[60548]: DEBUG nova.network.neutron [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 909.682028] env[60548]: DEBUG nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 909.749692] env[60548]: DEBUG nova.policy [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'de88047992ae4098acd029bd2bd55b1f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a91009401dd409ca662573757dfaf88', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 909.757784] env[60548]: DEBUG nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Start spawning the instance on the hypervisor. 
{{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 909.781104] env[60548]: DEBUG nova.virt.hardware [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=<?>,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-07-22T11:06:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 909.781401] env[60548]: DEBUG nova.virt.hardware [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 909.781558] env[60548]: DEBUG nova.virt.hardware [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 909.781733] env[60548]: DEBUG nova.virt.hardware [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 909.782007] env[60548]: DEBUG nova.virt.hardware [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 909.782196] env[60548]: DEBUG nova.virt.hardware [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 909.782405] env[60548]: DEBUG nova.virt.hardware [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 909.782564] env[60548]: DEBUG nova.virt.hardware [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
909.782724] env[60548]: DEBUG nova.virt.hardware [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 909.782883] env[60548]: DEBUG nova.virt.hardware [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 909.783064] env[60548]: DEBUG nova.virt.hardware [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 909.783930] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f341b9f-5a1d-4abe-bb6d-05f68b959a27 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 909.792228] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3190c5ff-ec82-44de-b3f0-0855a5548c9c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 910.193716] env[60548]: DEBUG nova.network.neutron [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Successfully created port: 5274de6d-19ff-4581-a3b2-246b42ce746a {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 910.801801] env[60548]: DEBUG nova.compute.manager [req-f814072e-d7db-4313-b423-3925807a03fc req-7cd1cd93-b267-449e-9983-72c1cf178784 service nova] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Received event network-vif-plugged-5274de6d-19ff-4581-a3b2-246b42ce746a {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 910.802067] env[60548]: DEBUG oslo_concurrency.lockutils [req-f814072e-d7db-4313-b423-3925807a03fc req-7cd1cd93-b267-449e-9983-72c1cf178784 service nova] Acquiring lock "30cf201d-7a1c-479c-9040-fba38726d9ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 910.802358] env[60548]: DEBUG oslo_concurrency.lockutils [req-f814072e-d7db-4313-b423-3925807a03fc req-7cd1cd93-b267-449e-9983-72c1cf178784 service nova] Lock "30cf201d-7a1c-479c-9040-fba38726d9ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 910.802387] env[60548]: DEBUG oslo_concurrency.lockutils [req-f814072e-d7db-4313-b423-3925807a03fc req-7cd1cd93-b267-449e-9983-72c1cf178784 service nova] Lock "30cf201d-7a1c-479c-9040-fba38726d9ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 910.802548] env[60548]: DEBUG 
nova.compute.manager [req-f814072e-d7db-4313-b423-3925807a03fc req-7cd1cd93-b267-449e-9983-72c1cf178784 service nova] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] No waiting events found dispatching network-vif-plugged-5274de6d-19ff-4581-a3b2-246b42ce746a {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 910.802713] env[60548]: WARNING nova.compute.manager [req-f814072e-d7db-4313-b423-3925807a03fc req-7cd1cd93-b267-449e-9983-72c1cf178784 service nova] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Received unexpected event network-vif-plugged-5274de6d-19ff-4581-a3b2-246b42ce746a for instance with vm_state building and task_state spawning. [ 910.863260] env[60548]: DEBUG nova.network.neutron [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Successfully updated port: 5274de6d-19ff-4581-a3b2-246b42ce746a {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 910.873663] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquiring lock "refresh_cache-30cf201d-7a1c-479c-9040-fba38726d9ab" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 910.873803] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquired lock "refresh_cache-30cf201d-7a1c-479c-9040-fba38726d9ab" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 910.873949] env[60548]: DEBUG nova.network.neutron [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 910.929226] env[60548]: DEBUG nova.network.neutron [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Instance cache missing network info. 
{{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 911.105614] env[60548]: DEBUG nova.network.neutron [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Updating instance_info_cache with network_info: [{"id": "5274de6d-19ff-4581-a3b2-246b42ce746a", "address": "fa:16:3e:0e:75:33", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5274de6d-19", "ovs_interfaceid": "5274de6d-19ff-4581-a3b2-246b42ce746a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 911.125931] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Releasing lock "refresh_cache-30cf201d-7a1c-479c-9040-fba38726d9ab" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 911.126275] env[60548]: DEBUG nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Instance network_info: |[{"id": "5274de6d-19ff-4581-a3b2-246b42ce746a", "address": "fa:16:3e:0e:75:33", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5274de6d-19", "ovs_interfaceid": "5274de6d-19ff-4581-a3b2-246b42ce746a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 911.126646] env[60548]: DEBUG 
nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0e:75:33', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '15922696-dc08-44ef-97be-0b09a9dfeae8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5274de6d-19ff-4581-a3b2-246b42ce746a', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 911.140503] env[60548]: DEBUG oslo.service.loopingcall [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 911.141142] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 911.141411] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ec40c220-1ad7-4331-b7c3-85fa70a586c9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 911.164030] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 911.164030] env[60548]: value = "task-4323370" [ 911.164030] env[60548]: _type = "Task" [ 911.164030] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 911.178025] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323370, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 911.675877] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323370, 'name': CreateVM_Task, 'duration_secs': 0.330329} completed successfully. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 911.676431] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 911.677133] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 911.677295] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 911.677659] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 911.677927] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0fcb244c-006b-4040-8d59-55e168057a1f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 911.683184] env[60548]: DEBUG oslo_vmware.api [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Waiting for the task: (returnval){ [ 911.683184] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]527ff2cd-5bda-f584-30ab-70de21b9f1a4" [ 911.683184] env[60548]: _type = "Task" [ 911.683184] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 911.694169] env[60548]: DEBUG oslo_vmware.api [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]527ff2cd-5bda-f584-30ab-70de21b9f1a4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 912.193821] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 912.194097] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 912.194308] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 912.834149] env[60548]: DEBUG nova.compute.manager [req-fcd6cea1-8bc5-47ee-86f8-fef231991f10 req-8dde0979-e4ec-4231-b28f-2d1a37a64260 service nova] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Received event network-changed-5274de6d-19ff-4581-a3b2-246b42ce746a {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 912.834149] env[60548]: DEBUG nova.compute.manager [req-fcd6cea1-8bc5-47ee-86f8-fef231991f10 req-8dde0979-e4ec-4231-b28f-2d1a37a64260 service nova] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Refreshing instance network info cache due to event network-changed-5274de6d-19ff-4581-a3b2-246b42ce746a. 
{{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 912.834149] env[60548]: DEBUG oslo_concurrency.lockutils [req-fcd6cea1-8bc5-47ee-86f8-fef231991f10 req-8dde0979-e4ec-4231-b28f-2d1a37a64260 service nova] Acquiring lock "refresh_cache-30cf201d-7a1c-479c-9040-fba38726d9ab" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 912.834149] env[60548]: DEBUG oslo_concurrency.lockutils [req-fcd6cea1-8bc5-47ee-86f8-fef231991f10 req-8dde0979-e4ec-4231-b28f-2d1a37a64260 service nova] Acquired lock "refresh_cache-30cf201d-7a1c-479c-9040-fba38726d9ab" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 912.834149] env[60548]: DEBUG nova.network.neutron [req-fcd6cea1-8bc5-47ee-86f8-fef231991f10 req-8dde0979-e4ec-4231-b28f-2d1a37a64260 service nova] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Refreshing network info cache for port 5274de6d-19ff-4581-a3b2-246b42ce746a {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 912.869158] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Acquiring lock "3a4668ee-e420-4ad8-b638-95b3d55e00c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 912.869158] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Lock "3a4668ee-e420-4ad8-b638-95b3d55e00c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 913.196969] env[60548]: DEBUG nova.network.neutron [req-fcd6cea1-8bc5-47ee-86f8-fef231991f10 req-8dde0979-e4ec-4231-b28f-2d1a37a64260 service nova] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Updated VIF entry in instance network info cache for port 5274de6d-19ff-4581-a3b2-246b42ce746a. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 913.197410] env[60548]: DEBUG nova.network.neutron [req-fcd6cea1-8bc5-47ee-86f8-fef231991f10 req-8dde0979-e4ec-4231-b28f-2d1a37a64260 service nova] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Updating instance_info_cache with network_info: [{"id": "5274de6d-19ff-4581-a3b2-246b42ce746a", "address": "fa:16:3e:0e:75:33", "network": {"id": "3997471b-98d7-4378-a40f-9db897299a3f", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "8c5f3f2bd0c84c96a1b1dc646afca847", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15922696-dc08-44ef-97be-0b09a9dfeae8", "external-id": "nsx-vlan-transportzone-791", "segmentation_id": 791, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5274de6d-19", "ovs_interfaceid": "5274de6d-19ff-4581-a3b2-246b42ce746a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 913.207187] env[60548]: DEBUG oslo_concurrency.lockutils [req-fcd6cea1-8bc5-47ee-86f8-fef231991f10 req-8dde0979-e4ec-4231-b28f-2d1a37a64260 service nova] Releasing lock "refresh_cache-30cf201d-7a1c-479c-9040-fba38726d9ab" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 929.174233] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 929.174604] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 930.172276] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 930.172467] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] CONF.reclaim_instance_interval <= 0, skipping... 
[ 931.171210] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 931.171564] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 931.171769] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 931.182607] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 931.182911] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 931.183163] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 931.183360] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60548) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 931.184681] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2cb1139-6dd0-42b4-9c1e-cbe8f9bfa3c4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 931.193519] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4257f8c9-d2e6-411e-80c7-cce6231e3b6c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 931.208014] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-168af00b-9cac-45ba-af30-9e8242cd87c4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 931.215535] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d36c0eb-118f-4fc2-b5da-a86f2b0ced14 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 931.245280] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180694MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60548) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 931.245462] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 931.245627] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 931.289111] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance b14d8d3f-9253-4e5f-a5b8-09a2f43888a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 931.289283] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 30cf201d-7a1c-479c-9040-fba38726d9ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 931.301950] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance ad98988d-92aa-4ace-8e40-cd316758002e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 931.313138] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 585e3015-faef-40df-b3dd-04d2c8e4dd00 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 931.323138] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance e3fd811a-186d-436f-bdef-a910a3ccd416 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 931.333107] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 8f447658-c66d-4d94-af30-fd43c83dae0e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 931.344407] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 979c5fe5-051f-4a43-be2f-571aad25a4ae has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 931.355227] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance e6466fbb-a225-4bbd-839b-f8c4b24d9860 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 931.366847] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 931.377788] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance d0d515a4-15ce-4276-b151-34a8a556a1df has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 931.389749] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3a4668ee-e420-4ad8-b638-95b3d55e00c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 931.389749] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 931.389749] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=100GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 931.534311] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9a50538-bf25-40c0-bbfd-1592a87b1f6f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 931.542653] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae9753e1-97be-485c-a5b6-85364871004e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 931.572448] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55a64780-fb57-4a90-9421-f2ed9a1b631f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 931.579829] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26586711-afbf-4b75-a144-e7afcf78a0f2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 931.592991] env[60548]: DEBUG nova.compute.provider_tree [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 931.601891] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 931.615330] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60548) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 931.615539] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.370s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
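Annotation: the inventory record above fixes what the scheduler can place on this provider. Placement treats the usable capacity of each resource class as (total - reserved) * allocation_ratio; a back-of-the-envelope check against the logged numbers (a sketch of the arithmetic, not placement code):

```python
# Inventory exactly as logged for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64.
inventory = {
    "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 200, "reserved": 0, "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    # Placement's capacity rule: (total - reserved) * allocation_ratio.
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: {capacity:g} schedulable")
# VCPU: 192 schedulable
# MEMORY_MB: 196078 schedulable
# DISK_GB: 200 schedulable
```

Note that max_unit in the logged inventory (16 VCPU, 65530 MB, 97 GB) separately caps what any single allocation may request, independent of total capacity.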
[ 934.611953] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 934.612297] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 934.612419] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Starting heal instance info cache {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 934.612508] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Rebuilding the list of instances to heal {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 934.623912] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 934.624085] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}}
[ 934.624218] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Didn't find any instances for network info cache update. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 934.624665] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 951.542144] env[60548]: DEBUG nova.compute.manager [req-ee580dce-7fcd-42d7-acc7-b70eb2d1a8a5 req-0e447284-8d1b-4553-a189-36809cf3db37 service nova] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Received event network-vif-deleted-5274de6d-19ff-4581-a3b2-246b42ce746a {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 951.542144] env[60548]: INFO nova.compute.manager [req-ee580dce-7fcd-42d7-acc7-b70eb2d1a8a5 req-0e447284-8d1b-4553-a189-36809cf3db37 service nova] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Neutron deleted interface 5274de6d-19ff-4581-a3b2-246b42ce746a; detaching it from the instance and deleting it from the info cache
[ 951.542144] env[60548]: DEBUG nova.network.neutron [req-ee580dce-7fcd-42d7-acc7-b70eb2d1a8a5 req-0e447284-8d1b-4553-a189-36809cf3db37 service nova] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 951.551772] env[60548]: DEBUG oslo_concurrency.lockutils [req-ee580dce-7fcd-42d7-acc7-b70eb2d1a8a5 req-0e447284-8d1b-4553-a189-36809cf3db37 service nova] Acquiring lock "30cf201d-7a1c-479c-9040-fba38726d9ab" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 957.250827] env[60548]: WARNING oslo_vmware.rw_handles [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 957.250827] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 957.250827] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 957.250827] env[60548]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 957.250827] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 957.250827] env[60548]: ERROR oslo_vmware.rw_handles response.begin()
[ 957.250827] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 957.250827] env[60548]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 957.250827] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 957.250827] env[60548]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 957.250827] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 957.250827] env[60548]: ERROR oslo_vmware.rw_handles
[ 957.251442] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to vmware_temp/7d8ba19d-590c-4566-aa28-d568fb28159c/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 957.253439] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Caching image {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 957.253745] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Copying Virtual Disk [datastore1] vmware_temp/7d8ba19d-590c-4566-aa28-d568fb28159c/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk to [datastore1] vmware_temp/7d8ba19d-590c-4566-aa28-d568fb28159c/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk {{(pid=60548) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 957.254049] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6cf7df49-35c3-4d28-9d14-e920c1ed0685 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 957.264836] env[60548]: DEBUG oslo_vmware.api [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Waiting for the task: (returnval){
[ 957.264836] env[60548]: value = "task-4323371"
[ 957.264836] env[60548]: _type = "Task"
[ 957.264836] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 957.272682] env[60548]: DEBUG oslo_vmware.api [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Task: {'id': task-4323371, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 957.775085] env[60548]: DEBUG oslo_vmware.exceptions [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Fault InvalidArgument not matched. {{(pid=60548) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 957.775360] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 957.776014] env[60548]: ERROR nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 957.776014] env[60548]: Faults: ['InvalidArgument']
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Traceback (most recent call last):
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] yield resources
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] self.driver.spawn(context, instance, image_meta,
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] self._fetch_image_if_missing(context, vi)
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] image_cache(vi, tmp_image_ds_loc)
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] vm_util.copy_virtual_disk(
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] session._wait_for_task(vmdk_copy_task)
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] return self.wait_for_task(task_ref)
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] return evt.wait()
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] result = hub.switch()
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] return self.greenlet.switch()
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] self.f(*self.args, **self.kw)
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] raise exceptions.translate_fault(task_info.error)
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Faults: ['InvalidArgument']
[ 957.776014] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6]
[ 957.776930] env[60548]: INFO nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Terminating instance
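Annotation: the VimFaultException above surfaces through oslo.vmware's task polling. wait_for_task starts a looping call (under eventlet, as the traceback frames show) that re-reads the task's info until it leaves the queued/running states, then either returns the result or translates the VIM fault into a Python exception. A schematic stand-in for that loop, not the real oslo.vmware code:

```python
import time

class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""

def wait_for_task(poll_task_info, interval=0.5):
    # Schematic version of the wait_for_task/_poll_task pair: keep polling
    # until the task settles, then return or raise.
    while True:
        info = poll_task_info()  # one RetrievePropertiesEx round-trip
        if info["state"] == "success":
            return info.get("result")
        if info["state"] == "error":
            # The real code maps the VIM fault list to a specific exception
            # class via exceptions.translate_fault().
            raise VimFaultException(info["error"])
        time.sleep(interval)     # 'queued' / 'running': poll again

# Simulated task that fails the way CopyVirtualDisk_Task did above.
states = iter([
    {"state": "running"},
    {"state": "error",
     "error": "A specified parameter was not correct: fileType"},
])
try:
    wait_for_task(lambda: next(states))
except VimFaultException as exc:
    print("task failed:", exc)
```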
[ 957.777935] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 957.778155] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 957.778396] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c1ae1f8c-0be3-42d7-bbc3-f0f0f3eb3e07 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 957.780719] env[60548]: DEBUG nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 957.780911] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 957.781652] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08ce04f1-465b-4cd8-8b7c-9707404c715b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 957.789786] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 957.790050] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-94275da2-f5bc-45de-bc53-a8169a877bc7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 957.792359] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 957.792531] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 957.793554] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a66423b3-2d26-4565-8674-adc4ea2c00ba {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 957.799322] env[60548]: DEBUG oslo_vmware.api [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Waiting for the task: (returnval){
[ 957.799322] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52e1eb63-d38e-0127-d82d-0284b75de66e"
[ 957.799322] env[60548]: _type = "Task"
[ 957.799322] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 957.807361] env[60548]: DEBUG oslo_vmware.api [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52e1eb63-d38e-0127-d82d-0284b75de66e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 957.929294] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 957.929521] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 957.929702] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Deleting the datastore file [datastore1] b14d8d3f-9253-4e5f-a5b8-09a2f43888a6 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 957.929971] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f6721c85-660d-4d45-9144-327e126380de {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 957.937083] env[60548]: DEBUG oslo_vmware.api [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Waiting for the task: (returnval){
[ 957.937083] env[60548]: value = "task-4323373"
[ 957.937083] env[60548]: _type = "Task"
[ 957.937083] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 957.945213] env[60548]: DEBUG oslo_vmware.api [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Task: {'id': task-4323373, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 958.310334] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 958.310616] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Creating directory with path [datastore1] vmware_temp/604fbd6e-6560-4093-a744-4d3e2250ff67/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 958.310807] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4570f718-e95e-4546-9272-40fea3fbc2f7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.322733] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Created directory with path [datastore1] vmware_temp/604fbd6e-6560-4093-a744-4d3e2250ff67/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 958.322934] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Fetch image to [datastore1] vmware_temp/604fbd6e-6560-4093-a744-4d3e2250ff67/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 958.323134] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/604fbd6e-6560-4093-a744-4d3e2250ff67/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 958.323899] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f1244b8-ea29-4585-99fb-6fe0b96627a9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.330979] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a1b9e78-4eeb-4a96-8172-1e75ffcc2e28 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.340434] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c074d138-3ac9-4593-9472-5caa900d76b0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.371742] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e52be0f-7a7e-4172-9e16-e2e74283cb89 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.377923] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-eada71b5-8d06-4424-ad80-a0bc73c6edbf {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.404495] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 958.446700] env[60548]: DEBUG oslo_vmware.api [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Task: {'id': task-4323373, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.090368} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 958.446927] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 958.447111] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 958.447285] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 958.447457] env[60548]: INFO nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Took 0.67 seconds to destroy the instance on the hypervisor.
[ 958.449684] env[60548]: DEBUG nova.compute.claims [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 958.449731] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 958.449931] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 958.599587] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 958.601412] env[60548]: ERROR nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b.
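Annotation: the Acquiring/acquired/released triplets around "compute_resources" come from oslo.concurrency's lockutils; the resource tracker serializes claim and abort paths behind one named semaphore. The general shape of the pattern (a minimal sketch, not the resource tracker's actual code):

```python
from oslo_concurrency import lockutils

# One named semaphore serializes every mutation of the tracker's view of
# the node; each decorated call produces one acquired/released pair in
# the log above.
COMPUTE_RESOURCE_SEMAPHORE = "compute_resources"

class ResourceTrackerSketch:
    @lockutils.synchronized(COMPUTE_RESOURCE_SEMAPHORE)
    def abort_instance_claim(self, instance):
        # Runs with the lock held; the "held 0.290s" figure in the log is
        # the wall-clock time spent inside the decorated method.
        pass

ResourceTrackerSketch().abort_instance_claim(instance=None)
```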
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Traceback (most recent call last):
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] result = getattr(controller, method)(*args, **kwargs)
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self._get(image_id)
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return RequestIdProxy(wrapped(*args, **kwargs))
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] resp, body = self.http_client.get(url, headers=header)
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self.request(url, 'GET', **kwargs)
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self._handle_response(resp)
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] raise exc.from_response(resp, resp.content)
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] During handling of the above exception, another exception occurred:
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Traceback (most recent call last):
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] yield resources
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] self.driver.spawn(context, instance, image_meta,
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] self._fetch_image_if_missing(context, vi)
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] image_fetch(context, vi, tmp_image_ds_loc)
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] images.fetch_image(
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 958.601412] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] metadata = IMAGE_API.get(context, image_ref)
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return session.show(context, image_id,
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] _reraise_translated_image_exception(image_id)
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] raise new_exc.with_traceback(exc_trace)
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] result = getattr(controller, method)(*args, **kwargs)
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self._get(image_id)
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return RequestIdProxy(wrapped(*args, **kwargs))
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] resp, body = self.http_client.get(url, headers=header)
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self.request(url, 'GET', **kwargs)
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self._handle_response(resp)
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] raise exc.from_response(resp, resp.content)
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b.
[ 958.602603] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 958.602603] env[60548]: INFO nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Terminating instance
[ 958.603322] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 958.603495] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 958.604136] env[60548]: DEBUG nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 958.604384] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 958.604611] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-22d28666-f9f8-4a56-8f1d-d94f10681385 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.607192] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7f32803c-fafc-4e03-bebb-9cabc148b051 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.618568] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 958.619857] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-78c77e86-070d-4b27-bd5a-c464d0f5aad9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.621533] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
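Annotation: the chained traceback above ends in nova's glance wrapper converting the client-level HTTPUnauthorized into nova.exception.ImageNotAuthorized while keeping the original traceback, which is why the glanceclient frames reappear under the translated exception. The pattern, roughly (minimal stand-in names, not nova's actual helper):

```python
import sys

class HTTPUnauthorized(Exception):
    """Stand-in for glanceclient.exc.HTTPUnauthorized."""

class ImageNotAuthorized(Exception):
    """Stand-in for nova.exception.ImageNotAuthorized."""

def reraise_translated(image_id):
    # Same trick as _reraise_translated_image_exception in the traceback:
    # swap the exception type but keep the original traceback object.
    _exc_type, exc_value, exc_trace = sys.exc_info()
    if isinstance(exc_value, HTTPUnauthorized):
        raise ImageNotAuthorized(
            f"Not authorized for image {image_id}.").with_traceback(exc_trace)
    raise  # anything else propagates unchanged

def show(image_id):
    try:
        raise HTTPUnauthorized("HTTP 401 Unauthorized")
    except Exception:
        reraise_translated(image_id)

try:
    show("5674e50f-0c0c-4f19-8379-104dac34660b")
except ImageNotAuthorized as exc:
    print(type(exc).__name__, exc)
```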
[ 958.621706] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 958.622832] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-98a98625-bbc3-49d4-8b99-6c77076a6638 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.626745] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5391eeb7-7f13-4539-8976-68fd72174ad1 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.646967] env[60548]: DEBUG oslo_vmware.api [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Waiting for the task: (returnval){
[ 958.646967] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52c976f8-f892-ff54-f989-82b98bd1668c"
[ 958.646967] env[60548]: _type = "Task"
[ 958.646967] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 958.658020] env[60548]: DEBUG oslo_vmware.api [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52c976f8-f892-ff54-f989-82b98bd1668c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 958.660044] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca126ca8-4b5f-4e65-a343-9e9c9120d915 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.690998] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5c103e1-fbe9-40cd-8260-5cefa55cbb02 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.697080] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 958.697346] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 958.697525] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Deleting the datastore file [datastore1] 2751bdfb-2f28-48e0-98c2-f232ed6da6df {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 958.699776] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e6511d77-ec99-4c76-812d-fc59d7c6fa16 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.702662] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-034c051c-0720-441f-b37a-e0e6b7e2d1cf {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 958.718443] env[60548]: DEBUG nova.compute.provider_tree [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 958.721202] env[60548]: DEBUG oslo_vmware.api [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Waiting for the task: (returnval){
[ 958.721202] env[60548]: value = "task-4323375"
[ 958.721202] env[60548]: _type = "Task"
[ 958.721202] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 958.727239] env[60548]: DEBUG nova.scheduler.client.report [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 958.734143] env[60548]: DEBUG oslo_vmware.api [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Task: {'id': task-4323375, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 958.740245] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.290s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 958.740764] env[60548]: ERROR nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 958.740764] env[60548]: Faults: ['InvalidArgument']
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Traceback (most recent call last):
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] self.driver.spawn(context, instance, image_meta,
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] self._fetch_image_if_missing(context, vi)
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] image_cache(vi, tmp_image_ds_loc)
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] vm_util.copy_virtual_disk(
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] session._wait_for_task(vmdk_copy_task)
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] return self.wait_for_task(task_ref)
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] return evt.wait()
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] result = hub.switch()
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] return self.greenlet.switch()
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] self.f(*self.args, **self.kw)
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] raise exceptions.translate_fault(task_info.error)
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Faults: ['InvalidArgument']
[ 958.740764] env[60548]: ERROR nova.compute.manager [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6]
[ 958.741760] env[60548]: DEBUG nova.compute.utils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] VimFaultException {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 958.743358] env[60548]: DEBUG nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Build of instance b14d8d3f-9253-4e5f-a5b8-09a2f43888a6 was re-scheduled: A specified parameter was not correct: fileType
[ 958.743358] env[60548]: Faults: ['InvalidArgument'] {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 958.743775] env[60548]: DEBUG nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 958.743973] env[60548]: DEBUG nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 958.744170] env[60548]: DEBUG nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 958.744346] env[60548]: DEBUG nova.network.neutron [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 959.157204] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 959.157479] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Creating directory with path [datastore1] vmware_temp/9cc434f6-4cb2-40b4-94fb-3d4a449d05e6/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 959.157718] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1f178683-3fd3-4c4d-bd48-728f6911e925 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.175122] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Created directory with path [datastore1] vmware_temp/9cc434f6-4cb2-40b4-94fb-3d4a449d05e6/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 959.175122] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Fetch image to [datastore1] vmware_temp/9cc434f6-4cb2-40b4-94fb-3d4a449d05e6/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 959.175122] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/9cc434f6-4cb2-40b4-94fb-3d4a449d05e6/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 959.175982] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c1b5fc3-19af-4f6d-8a0d-62f14e9545b3 {{(pid=60548) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.184796] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dee8950-53f2-4614-b141-645ce21a2697 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.194886] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf1a01ae-791e-4350-85dc-03352c50c3a4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.228692] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5392aee9-6c9c-43a5-aeae-e87c1e56016d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.238882] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2f7aec48-ed49-4bf8-8a5d-1a4ddbe98f99 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.243389] env[60548]: DEBUG oslo_vmware.api [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Task: {'id': task-4323375, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.10243} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 959.243653] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 959.243827] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 959.243995] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 959.244171] env[60548]: INFO nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Took 0.64 seconds to destroy the instance on the hypervisor. 
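The DeleteDatastoreFile_Task records above show oslo.vmware's submit-then-poll task pattern: the SOAP invocation returns a task reference, wait_for_task() hands it to _poll_task (the "progress is 0%" lines at api.py:434), and the task either completes with a recorded duration_secs or has its task_info.error raised as a fault, which is exactly the translate_fault frame in the fileType traceback earlier. A minimal sketch of that pattern follows; the host, credentials, datacenter moref, and datastore path are placeholders for illustration, not values from this run.

    # Sketch of the oslo.vmware submit-then-poll task pattern seen in
    # these records. Host, credentials, dc_ref, and the file path are
    # placeholders, not taken from this log.
    from oslo_vmware import api

    session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                   api_retry_count=10,
                                   task_poll_interval=0.5)

    dc_ref = None  # a real caller passes the Datacenter moref here

    # FileManager.DeleteDatastoreFile_Task returns a task moref;
    # wait_for_task() loops in _poll_task until the task finishes, and
    # raises exceptions.translate_fault(task_info.error) on failure.
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              session.vim.service_content.fileManager,
                              name='[datastore1] some-instance-dir',
                              datacenter=dc_ref)
    session.wait_for_task(task)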
[ 959.246946] env[60548]: DEBUG nova.compute.claims [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 959.246946] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 959.247088] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 959.275429] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 959.276191] env[60548]: DEBUG nova.compute.utils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Instance 2751bdfb-2f28-48e0-98c2-f232ed6da6df could not be found. {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 959.278545] env[60548]: DEBUG nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Instance disappeared during build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 959.278742] env[60548]: DEBUG nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 959.278953] env[60548]: DEBUG nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 959.279152] env[60548]: DEBUG nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 959.279315] env[60548]: DEBUG nova.network.neutron [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 959.339710] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 959.381186] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 959.382022] env[60548]: ERROR nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b. 
[ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Traceback (most recent call last): [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] result = getattr(controller, method)(*args, **kwargs) [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self._get(image_id) [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return RequestIdProxy(wrapped(*args, **kwargs)) [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] resp, body = self.http_client.get(url, headers=header) [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self.request(url, 'GET', **kwargs) [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self._handle_response(resp) [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] raise exc.from_response(resp, resp.content) [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] During handling of the above exception, another exception occurred: [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Traceback (most recent call last): [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] yield resources [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self.driver.spawn(context, instance, image_meta, [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self._fetch_image_if_missing(context, vi) [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] image_fetch(context, vi, tmp_image_ds_loc) [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] images.fetch_image( [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 959.382022] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] metadata = IMAGE_API.get(context, image_ref) [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return session.show(context, image_id, [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] _reraise_translated_image_exception(image_id) [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", 
line 1031, in _reraise_translated_image_exception [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] raise new_exc.with_traceback(exc_trace) [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] result = getattr(controller, method)(*args, **kwargs) [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self._get(image_id) [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return RequestIdProxy(wrapped(*args, **kwargs)) [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] resp, body = self.http_client.get(url, headers=header) [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self.request(url, 'GET', **kwargs) [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self._handle_response(resp) [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] raise exc.from_response(resp, resp.content) [ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b. 
[ 959.383116] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 959.383116] env[60548]: INFO nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Terminating instance [ 959.384559] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 959.384842] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 959.385538] env[60548]: DEBUG nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 959.385729] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 959.385959] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-57f54ed7-0e92-47c0-a259-4fee15df0c73 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.389171] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90907eef-9639-4ae4-83f6-b5a2a85c61f5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.398357] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 959.398489] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f9ab3958-92a8-4995-bc8d-28859db2f55f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.401049] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 959.401235] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 
tempest-VolumesAdminNegativeTest-1171589411-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 959.402194] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3f1de1e2-511d-4fea-8c04-1262c9806f8c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 959.409416] env[60548]: DEBUG oslo_vmware.api [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Waiting for the task: (returnval){ [ 959.409416] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]522fec5c-be72-529d-ab67-9ddb43d403eb" [ 959.409416] env[60548]: _type = "Task" [ 959.409416] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 959.417628] env[60548]: DEBUG oslo_vmware.api [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]522fec5c-be72-529d-ab67-9ddb43d403eb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 959.459509] env[60548]: DEBUG neutronclient.v2_0.client [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60548) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 959.462830] env[60548]: ERROR nova.compute.manager [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Traceback (most recent call last): [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] result = getattr(controller, method)(*args, **kwargs) [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self._get(image_id) [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return RequestIdProxy(wrapped(*args, **kwargs)) [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] resp, body = self.http_client.get(url, headers=header) [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self.request(url, 'GET', **kwargs) [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self._handle_response(resp) [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] raise exc.from_response(resp, resp.content) [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] During handling of the above exception, another exception occurred: [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Traceback (most recent call last): [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] self.driver.spawn(context, instance, image_meta, [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] self._vmops.spawn(context, instance, image_meta, injected_files, [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] self._fetch_image_if_missing(context, vi) [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] image_fetch(context, vi, tmp_image_ds_loc) [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] images.fetch_image( [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] metadata = IMAGE_API.get(context, image_ref) [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return session.show(context, image_id, [ 959.462830] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] _reraise_translated_image_exception(image_id) [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] raise new_exc.with_traceback(exc_trace) [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File 
"/opt/stack/nova/nova/image/glance.py", line 285, in show [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] result = getattr(controller, method)(*args, **kwargs) [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self._get(image_id) [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return RequestIdProxy(wrapped(*args, **kwargs)) [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] resp, body = self.http_client.get(url, headers=header) [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self.request(url, 'GET', **kwargs) [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self._handle_response(resp) [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] raise exc.from_response(resp, resp.content) [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b. 
[ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] During handling of the above exception, another exception occurred: [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Traceback (most recent call last): [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] self._build_and_run_instance(context, instance, image, [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] with excutils.save_and_reraise_exception(): [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] self.force_reraise() [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] raise self.value [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] with self.rt.instance_claim(context, instance, node, allocs, [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] self.abort() [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 959.464228] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return f(*args, **kwargs) [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] self._unset_instance_host_and_node(instance) [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 
2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] instance.save() [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] updates, result = self.indirection_api.object_action( [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return cctxt.call(context, 'object_action', objinst=objinst, [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] result = self.transport._send( [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self._driver.send(target, ctxt, message, [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] raise result [ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] nova.exception_Remote.InstanceNotFound_Remote: Instance 2751bdfb-2f28-48e0-98c2-f232ed6da6df could not be found. 
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Traceback (most recent call last):
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     return getattr(target, method)(*args, **kwargs)
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     return fn(self, *args, **kwargs)
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     old_ref, inst_ref = db.instance_update_and_get_original(
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     return f(*args, **kwargs)
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     with excutils.save_and_reraise_exception() as ectxt:
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     self.force_reraise()
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     raise self.value
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     return f(*args, **kwargs)
[ 959.465685] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     return f(context, *args, **kwargs)
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     raise exception.InstanceNotFound(instance_id=uuid)
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] nova.exception.InstanceNotFound: Instance 2751bdfb-2f28-48e0-98c2-f232ed6da6df could not be found.
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] During handling of the above exception, another exception occurred:
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Traceback (most recent call last):
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     ret = obj(*args, **kwargs)
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     exception_handler_v20(status_code, error_body)
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     raise client_exc(message=error_message,
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Neutron server returns request_ids: ['req-21d6970b-1eca-42ae-b0c1-ac53c0368f02']
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] During handling of the above exception, another exception occurred:
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Traceback (most recent call last):
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     self._deallocate_network(context, instance, requested_networks)
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     self.network_api.deallocate_for_instance(
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     data = neutron.list_ports(**search_opts)
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     ret = obj(*args, **kwargs)
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     return self.list('ports', self.ports_path, retrieve_all,
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     ret = obj(*args, **kwargs)
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 959.467622] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     for r in self._pagination(collection, path, **params):
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     res = self.get(path, params=params)
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     ret = obj(*args, **kwargs)
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     return self.retry_request("GET", action, body=body,
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     ret = obj(*args, **kwargs)
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     return self.do_request(method, action, body=body,
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     ret = obj(*args, **kwargs)
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     self._handle_fault_response(status_code, replybody, resp)
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]     raise exception.Unauthorized()
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] nova.exception.Unauthorized: Not authorized.
[ 959.469176] env[60548]: ERROR nova.compute.manager [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df]
[ 959.479962] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 959.480496] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 959.480496] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Deleting the datastore file [datastore1] 83ecd8bb-ba2b-4151-986b-26f50b54e8e2 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 959.480656] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7469850e-b481-4a6d-ba1c-5add3a6bb413 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 959.487578] env[60548]: DEBUG oslo_vmware.api [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Waiting for the task: (returnval){
[ 959.487578] env[60548]: value = "task-4323377"
[ 959.487578] env[60548]: _type = "Task"
[ 959.487578] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 959.488678] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d509edda-321c-4de0-9ca5-0b05c6399b67 tempest-InstanceActionsV221TestJSON-166999117 tempest-InstanceActionsV221TestJSON-166999117-project-member] Lock "2751bdfb-2f28-48e0-98c2-f232ed6da6df" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 323.506s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 959.499350] env[60548]: DEBUG oslo_vmware.api [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Task: {'id': task-4323377, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 959.500398] env[60548]: DEBUG nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 959.528555] env[60548]: DEBUG nova.network.neutron [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 959.549299] env[60548]: INFO nova.compute.manager [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Took 0.80 seconds to deallocate network for instance.
[ 959.563637] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 959.563906] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 959.567210] env[60548]: INFO nova.compute.claims [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 959.634316] env[60548]: INFO nova.scheduler.client.report [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Deleted allocations for instance b14d8d3f-9253-4e5f-a5b8-09a2f43888a6
[ 959.656876] env[60548]: DEBUG oslo_concurrency.lockutils [None req-b94bcd83-3081-4fd8-8d31-960a52aff5d3 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 385.742s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 959.658532] env[60548]: DEBUG oslo_concurrency.lockutils [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 187.980s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 959.658776] env[60548]: DEBUG oslo_concurrency.lockutils [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Acquiring lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 959.658975] env[60548]: DEBUG oslo_concurrency.lockutils [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 959.659149] env[60548]: DEBUG oslo_concurrency.lockutils [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 959.661324] env[60548]: INFO nova.compute.manager [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Terminating instance
[ 959.663855] env[60548]: DEBUG nova.compute.manager [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 959.663855] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 959.663855] env[60548]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-39721a87-c390-4b6e-919a-987cb3d56310 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 959.677020] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2290d459-5e48-4e02-bc7d-61bf1e4f2282 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 959.688483] env[60548]: DEBUG nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 959.713485] env[60548]: WARNING nova.virt.vmwareapi.vmops [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b14d8d3f-9253-4e5f-a5b8-09a2f43888a6 could not be found.
[ 959.713687] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 959.713858] env[60548]: INFO nova.compute.manager [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 959.714133] env[60548]: DEBUG oslo.service.loopingcall [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 959.716817] env[60548]: DEBUG nova.compute.manager [-] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 959.716927] env[60548]: DEBUG nova.network.neutron [-] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 959.741859] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 959.742631] env[60548]: DEBUG nova.network.neutron [-] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 959.752813] env[60548]: INFO nova.compute.manager [-] [instance: b14d8d3f-9253-4e5f-a5b8-09a2f43888a6] Took 0.04 seconds to deallocate network for instance.
[ 959.772424] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbe4d6cf-2495-4f23-8270-5f51781c4aed {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 959.780617] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1c1643a-e2f1-402d-bcd3-4487961d0b6e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 959.815529] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92381ff6-1607-43ce-810f-0265b0cf772b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 959.825174] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfd789f6-18df-4145-83fb-48ac675327d0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 959.839061] env[60548]: DEBUG nova.compute.provider_tree [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 959.846870] env[60548]: DEBUG nova.scheduler.client.report [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 959.865310] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.301s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 959.865829] env[60548]: DEBUG nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 959.868440] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.127s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 959.869816] env[60548]: INFO nova.compute.claims [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 959.874205] env[60548]: DEBUG oslo_concurrency.lockutils [None req-bdab44a4-5913-468d-90cb-59a4ea100d42 tempest-ServersTestMultiNic-1533444800 tempest-ServersTestMultiNic-1533444800-project-member] Lock "b14d8d3f-9253-4e5f-a5b8-09a2f43888a6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.216s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 959.903062] env[60548]: DEBUG nova.compute.utils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 959.904676] env[60548]: DEBUG nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 959.904848] env[60548]: DEBUG nova.network.neutron [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 959.914605] env[60548]: DEBUG nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 959.923788] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 959.924073] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Creating directory with path [datastore1] vmware_temp/c1836b96-da0b-4f34-a498-cce05c6bd24b/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 959.924246] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-74b564aa-b2c3-4b78-86e0-6f22f6d50a31 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 959.936637] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Created directory with path [datastore1] vmware_temp/c1836b96-da0b-4f34-a498-cce05c6bd24b/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 959.936843] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Fetch image to [datastore1] vmware_temp/c1836b96-da0b-4f34-a498-cce05c6bd24b/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 959.937016] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/c1836b96-da0b-4f34-a498-cce05c6bd24b/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 959.940182] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0416a765-49d7-4e2e-8d7a-59418d74ad93 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 959.951979] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b195d56e-003a-4ed0-8f6a-b1aeb3070e0b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 959.964957] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44080c3b-f38c-4aab-b0ea-3f65bbd62446 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 959.973223] env[60548]: DEBUG nova.policy [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cac1cb748fa746608741e950ceadb1b8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e5343838b9d64cf0aeb72c368f89eea7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}}
[ 959.976466] env[60548]: DEBUG nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 960.010563] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cce4f72-ac73-4a87-80b6-7a66c2e7a93d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.020956] env[60548]: DEBUG oslo_vmware.api [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Task: {'id': task-4323377, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08164} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 960.023907] env[60548]: DEBUG nova.virt.hardware [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=<?>,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-07-22T11:06:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 960.023907] env[60548]: DEBUG nova.virt.hardware [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 960.023907] env[60548]: DEBUG nova.virt.hardware [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 960.023907] env[60548]: DEBUG nova.virt.hardware [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 960.023907] env[60548]: DEBUG nova.virt.hardware [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 960.024222] env[60548]: DEBUG nova.virt.hardware [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 960.024222] env[60548]: DEBUG nova.virt.hardware [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 960.024461] env[60548]: DEBUG nova.virt.hardware [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 960.024792] env[60548]: DEBUG nova.virt.hardware [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 960.024792] env[60548]: DEBUG nova.virt.hardware [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 960.024876] env[60548]: DEBUG nova.virt.hardware [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 960.025103] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5fd1ca45-74db-46eb-8c48-f73acd360d4c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.027046] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 960.027236] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 960.027408] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 960.027666] env[60548]: INFO nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Took 0.64 seconds to destroy the instance on the hypervisor.
[ 960.029607] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-594902f2-252f-4c98-ac00-742b8b71a1d6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.034849] env[60548]: DEBUG nova.compute.claims [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 960.035034] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 960.041721] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ecb9fa6-1f05-41b9-8a23-593783f32286 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.062357] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 960.112370] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fa7ba0f-f204-4e36-a470-94799861150b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.116344] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67ba890b-332b-4089-93f5-ff7add6d3c18 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.153629] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd75e3eb-9d55-47a8-9317-00ba2ad2f88a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.162864] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-893adb2c-b1f2-4203-9c0c-b2601b062dea {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.177649] env[60548]: DEBUG nova.compute.provider_tree [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 960.189753] env[60548]: DEBUG nova.scheduler.client.report [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 960.206654] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.338s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 960.207244] env[60548]: DEBUG nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 960.209828] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.174s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 960.216484] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 960.219456] env[60548]: ERROR nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b.
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Traceback (most recent call last):
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     result = getattr(controller, method)(*args, **kwargs)
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self._get(image_id)
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     resp, body = self.http_client.get(url, headers=header)
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self.request(url, 'GET', **kwargs)
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self._handle_response(resp)
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     raise exc.from_response(resp, resp.content)
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] During handling of the above exception, another exception occurred:
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Traceback (most recent call last):
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     yield resources
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self.driver.spawn(context, instance, image_meta,
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self._fetch_image_if_missing(context, vi)
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     image_fetch(context, vi, tmp_image_ds_loc)
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     images.fetch_image(
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 960.219456] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     metadata = IMAGE_API.get(context, image_ref)
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return session.show(context, image_id,
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     _reraise_translated_image_exception(image_id)
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     raise new_exc.with_traceback(exc_trace)
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     result = getattr(controller, method)(*args, **kwargs)
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self._get(image_id)
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     resp, body = self.http_client.get(url, headers=header)
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self.request(url, 'GET', **kwargs)
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self._handle_response(resp)
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     raise exc.from_response(resp, resp.content)
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b.
[ 960.220803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 960.220803] env[60548]: INFO nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Terminating instance
[ 960.220803] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 960.220803] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 960.220803] env[60548]: DEBUG nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 960.220803] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 960.220803] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-304db99a-fc3a-4560-a3ed-309bebf89ddd {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.223366] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4abb3cb2-32c6-4264-8dd2-44ce2fc4db1f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.231460] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 960.231731] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-50ac81ce-c1b4-46d9-8040-0dc6fce0388a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.234368] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 960.234541] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 960.235806] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6ab655f9-b112-49f1-a663-a095ed666e12 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.239057] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.030s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 960.239801] env[60548]: DEBUG nova.compute.utils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Instance 83ecd8bb-ba2b-4151-986b-26f50b54e8e2 could not be found. {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 960.242396] env[60548]: DEBUG oslo_vmware.api [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Waiting for the task: (returnval){
[ 960.242396] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52135b9a-a4b5-02ce-18f8-0a727aaaf32b"
[ 960.242396] env[60548]: _type = "Task"
[ 960.242396] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 960.242817] env[60548]: DEBUG nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Instance disappeared during build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 960.242979] env[60548]: DEBUG nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 960.243177] env[60548]: DEBUG nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 960.243347] env[60548]: DEBUG nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 960.243509] env[60548]: DEBUG nova.network.neutron [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 960.259099] env[60548]: DEBUG nova.compute.utils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 960.264383] env[60548]: DEBUG nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 960.264383] env[60548]: DEBUG nova.network.neutron [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 960.267237] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 960.267480] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Creating directory with path [datastore1] vmware_temp/f5b6f99a-1cdf-42ec-927f-31dcfb449ccb/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 960.268101] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-52dd64cb-42e1-43d8-8819-9c18cc518698 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.270921] env[60548]: DEBUG nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 960.296453] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Created directory with path [datastore1] vmware_temp/f5b6f99a-1cdf-42ec-927f-31dcfb449ccb/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 960.296453] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Fetch image to [datastore1] vmware_temp/f5b6f99a-1cdf-42ec-927f-31dcfb449ccb/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 960.296453] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/f5b6f99a-1cdf-42ec-927f-31dcfb449ccb/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 960.296453] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4466153b-10a3-4c05-b50d-0361f29326e5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.303519] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f4022a0-bf1f-4f86-b650-0c2865984892 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.317021] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb015d17-2949-462b-bc1e-f0b6e1b36241 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.322384] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 960.322626] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 960.322943] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Deleting the datastore file [datastore1] afb2cdc1-74ec-4d08-85cb-e96b4071f661 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 960.323830] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with
opID=oslo.vmware-078fc6ab-69bd-4f3e-9e5d-b675a36334e9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.361465] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9789094f-b065-42a8-81aa-86ca2be46fc2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.364037] env[60548]: DEBUG oslo_vmware.api [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Waiting for the task: (returnval){ [ 960.364037] env[60548]: value = "task-4323379" [ 960.364037] env[60548]: _type = "Task" [ 960.364037] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 960.370524] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-38468e92-52c6-4e39-a91e-e0f709f1e72c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.377206] env[60548]: DEBUG nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 960.378872] env[60548]: DEBUG oslo_vmware.api [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Task: {'id': task-4323379, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 960.384258] env[60548]: DEBUG nova.policy [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7d367cd15b484e3bad1edfd948ee2d38', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3c6e9283a81a4cc197709fd070c13c34', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 960.401146] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 960.429257] env[60548]: DEBUG nova.network.neutron [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Successfully created port: 59582163-0304-4f82-9a45-39db58047179 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 960.437518] env[60548]: DEBUG nova.virt.hardware [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 960.437518] env[60548]: DEBUG nova.virt.hardware [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 960.437518] env[60548]: DEBUG nova.virt.hardware [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 960.437518] env[60548]: DEBUG nova.virt.hardware [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 
tempest-ServersNegativeTestJSON-1737616329-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 960.437518] env[60548]: DEBUG nova.virt.hardware [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 960.437518] env[60548]: DEBUG nova.virt.hardware [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 960.438389] env[60548]: DEBUG nova.virt.hardware [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 960.438389] env[60548]: DEBUG nova.virt.hardware [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 960.438389] env[60548]: DEBUG nova.virt.hardware [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 960.438389] env[60548]: DEBUG nova.virt.hardware [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 960.438389] env[60548]: DEBUG nova.virt.hardware [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 960.439638] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0716b9b2-96f6-420e-a629-8bd9d5e11a08 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.442840] env[60548]: DEBUG neutronclient.v2_0.client [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60548) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 960.444653] env[60548]: ERROR nova.compute.manager [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] [instance: 
83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Traceback (most recent call last): [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] result = getattr(controller, method)(*args, **kwargs) [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self._get(image_id) [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return RequestIdProxy(wrapped(*args, **kwargs)) [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] resp, body = self.http_client.get(url, headers=header) [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self.request(url, 'GET', **kwargs) [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self._handle_response(resp) [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] raise exc.from_response(resp, resp.content) [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
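The traceback above ends in glanceclient.exc.HTTPUnauthorized, yet a few entries later the same failure is reported as nova.exception.ImageNotAuthorized: nova/image/glance.py catches the client error, maps it to a nova exception, and re-raises it on the original traceback (the "raise new_exc.with_traceback(exc_trace)" frame). A minimal sketch of that translation pattern, using stand-in exception classes rather than nova's real ones:

import sys


class HTTPUnauthorized(Exception):        # stand-in for glanceclient.exc.HTTPUnauthorized
    pass


class ImageNotAuthorized(Exception):      # stand-in for nova.exception.ImageNotAuthorized
    def __init__(self, image_id):
        super().__init__(f"Not authorized for image {image_id}.")


def _reraise_translated_image_exception(image_id):
    _, exc_value, exc_trace = sys.exc_info()
    if isinstance(exc_value, HTTPUnauthorized):
        new_exc = ImageNotAuthorized(image_id)
    else:
        new_exc = exc_value
    # Re-raising on the saved traceback is why the log shows glanceclient
    # frames underneath a nova.exception.ImageNotAuthorized heading.
    raise new_exc.with_traceback(exc_trace)


def show(image_id):
    try:
        raise HTTPUnauthorized("HTTP 401 Unauthorized")  # simulate the client call failing
    except Exception:
        _reraise_translated_image_exception(image_id)


try:
    show("5674e50f-0c0c-4f19-8379-104dac34660b")
except ImageNotAuthorized as exc:
    assert "Not authorized for image" in str(exc)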
[ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] During handling of the above exception, another exception occurred: [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Traceback (most recent call last): [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self.driver.spawn(context, instance, image_meta, [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self._fetch_image_if_missing(context, vi) [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] image_fetch(context, vi, tmp_image_ds_loc) [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] images.fetch_image( [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] metadata = IMAGE_API.get(context, image_ref) [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return session.show(context, image_id, [ 960.444653] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] _reraise_translated_image_exception(image_id) [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] raise new_exc.with_traceback(exc_trace) [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File 
"/opt/stack/nova/nova/image/glance.py", line 285, in show [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] result = getattr(controller, method)(*args, **kwargs) [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self._get(image_id) [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return RequestIdProxy(wrapped(*args, **kwargs)) [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] resp, body = self.http_client.get(url, headers=header) [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self.request(url, 'GET', **kwargs) [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self._handle_response(resp) [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] raise exc.from_response(resp, resp.content) [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b. 
[ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] During handling of the above exception, another exception occurred: [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Traceback (most recent call last): [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self._build_and_run_instance(context, instance, image, [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] with excutils.save_and_reraise_exception(): [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self.force_reraise() [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] raise self.value [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] with self.rt.instance_claim(context, instance, node, allocs, [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self.abort() [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 960.445587] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return f(*args, **kwargs) [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self._unset_instance_host_and_node(instance) [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 
83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] instance.save() [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] updates, result = self.indirection_api.object_action( [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return cctxt.call(context, 'object_action', objinst=objinst, [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] result = self.transport._send( [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self._driver.send(target, ctxt, message, [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] raise result [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] nova.exception_Remote.InstanceNotFound_Remote: Instance 83ecd8bb-ba2b-4151-986b-26f50b54e8e2 could not be found. 
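The frames through oslo_utils/excutils.py (__exit__ -> force_reraise -> raise self.value) come from oslo.utils' save_and_reraise_exception context manager, which nova uses to run cleanup inside an except block and then re-raise the original error. Roughly, and much simplified (the real class also logs and supports suppressing the re-raise):

import sys


class save_and_reraise_exception:
    # Simplified sketch of the oslo.utils behaviour seen in the frames above.

    def __enter__(self):
        # Capture the exception that is currently being handled.
        self.type_, self.value, self.tb = sys.exc_info()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if exc_type is None:
            self.force_reraise()        # cleanup succeeded: re-raise the saved error
        return False                    # a new exception from the block propagates

    def force_reraise(self):
        raise self.value                # the "raise self.value" frame in the traceback


# usage: re-raise the original error after best-effort cleanup
try:
    try:
        raise RuntimeError("instance claim failed")
    except RuntimeError:
        with save_and_reraise_exception():
            pass                        # cleanup goes here; may itself raise
except RuntimeError as exc:
    assert str(exc) == "instance claim failed"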
[ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Traceback (most recent call last): [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return getattr(target, method)(*args, **kwargs) [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return fn(self, *args, **kwargs) [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] old_ref, inst_ref = db.instance_update_and_get_original( [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return f(*args, **kwargs) [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] with excutils.save_and_reraise_exception() as ectxt: [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self.force_reraise() [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] raise self.value [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 960.446576] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return f(*args, **kwargs) [ 960.446576] 
env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return f(context, *args, **kwargs) [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] raise exception.InstanceNotFound(instance_id=uuid) [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] nova.exception.InstanceNotFound: Instance 83ecd8bb-ba2b-4151-986b-26f50b54e8e2 could not be found. [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] During handling of the above exception, another exception occurred: [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Traceback (most recent call last): [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] ret = obj(*args, **kwargs) [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] exception_handler_v20(status_code, error_body) [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] raise client_exc(message=error_message, [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 
83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Neutron server returns request_ids: ['req-5d9e85b3-bc5e-483e-b170-c406d94f7cae'] [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] During handling of the above exception, another exception occurred: [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Traceback (most recent call last): [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self._deallocate_network(context, instance, requested_networks) [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self.network_api.deallocate_for_instance( [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] data = neutron.list_ports(**search_opts) [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] ret = obj(*args, **kwargs) [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self.list('ports', self.ports_path, retrieve_all, [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] ret = obj(*args, **kwargs) [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 960.447825] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] for r in self._pagination(collection, path, **params): [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] res = self.get(path, params=params) [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 
83ecd8bb-ba2b-4151-986b-26f50b54e8e2] ret = obj(*args, **kwargs) [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self.retry_request("GET", action, body=body, [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] ret = obj(*args, **kwargs) [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] return self.do_request(method, action, body=body, [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] ret = obj(*args, **kwargs) [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] self._handle_fault_response(status_code, replybody, resp) [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] raise exception.Unauthorized() [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] nova.exception.Unauthorized: Not authorized. [ 960.450532] env[60548]: ERROR nova.compute.manager [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] [ 960.455102] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad978fa7-c780-44e5-a68e-6c007483b7a5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.476667] env[60548]: DEBUG oslo_concurrency.lockutils [None req-4b7be5c0-f76b-41bf-a3bd-40a889aab22d tempest-ServersV294TestFqdnHostnames-45939237 tempest-ServersV294TestFqdnHostnames-45939237-project-member] Lock "83ecd8bb-ba2b-4151-986b-26f50b54e8e2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 323.936s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 960.489648] env[60548]: DEBUG nova.compute.manager [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Starting instance... 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 960.555320] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 960.555711] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 960.557513] env[60548]: INFO nova.compute.claims [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 960.572103] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 960.573051] env[60548]: ERROR nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b. 
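The recurring triplets of 'Acquiring lock "compute_resources" ...', '... acquired ... waited 0.001s', and '... "released" ... held 323.936s' are emitted by the wrapper inside oslo.concurrency's lockutils (the lockutils.py:404/409/423 frames named in each entry). A minimal usage sketch, assuming only that the oslo.concurrency package is installed; the function body is illustrative:

from oslo_concurrency import lockutils


@lockutils.synchronized('compute_resources')
def instance_claim(instance_uuid):
    # Everything in here runs with the named lock held; the wrapper that
    # lockutils installs around this function logs the waited/held
    # durations that appear throughout this log.
    return f"claimed {instance_uuid}"


print(instance_claim('e3fd811a-186d-436f-bdef-a910a3ccd416'))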
[ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Traceback (most recent call last): [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] result = getattr(controller, method)(*args, **kwargs) [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self._get(image_id) [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] resp, body = self.http_client.get(url, headers=header) [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self.request(url, 'GET', **kwargs) [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self._handle_response(resp) [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] raise exc.from_response(resp, resp.content) [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
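In the deallocate_for_instance traceback earlier (the repeated nova/network/neutron.py line 196 "wrapper" frames), every neutronclient call is proxied so that a Keystone 401 surfaces as nova.exception.Unauthorized instead of a raw client error. The shape of that wrapper, with stand-in exception types:

import functools


class NeutronUnauthorized(Exception):   # stand-in for neutronclient's Unauthorized
    pass


class NovaUnauthorized(Exception):      # stand-in for nova.exception.Unauthorized
    pass


def translate_unauthorized(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)   # the 'ret = obj(*args, **kwargs)' frame
        except NeutronUnauthorized:
            # Token invalid or expired: surface a nova exception so the
            # compute manager logs 'Failed to deallocate networks: ... Not authorized.'
            raise NovaUnauthorized("Not authorized.")
    return wrapper


@translate_unauthorized
def list_ports(**search_opts):
    raise NeutronUnauthorized("401: The request you have made requires authentication.")


try:
    list_ports(device_id="83ecd8bb-ba2b-4151-986b-26f50b54e8e2")
except NovaUnauthorized as exc:
    assert str(exc) == "Not authorized."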
[ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] During handling of the above exception, another exception occurred: [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Traceback (most recent call last): [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] yield resources [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self.driver.spawn(context, instance, image_meta, [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self._fetch_image_if_missing(context, vi) [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] image_fetch(context, vi, tmp_image_ds_loc) [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] images.fetch_image( [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 960.573051] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] metadata = IMAGE_API.get(context, image_ref) [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return session.show(context, image_id, [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] _reraise_translated_image_exception(image_id) [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", 
line 1031, in _reraise_translated_image_exception [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] raise new_exc.with_traceback(exc_trace) [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] result = getattr(controller, method)(*args, **kwargs) [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self._get(image_id) [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] resp, body = self.http_client.get(url, headers=header) [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self.request(url, 'GET', **kwargs) [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self._handle_response(resp) [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] raise exc.from_response(resp, resp.content) [ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b. 
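The image-cache entries in this stretch (Acquired/Releasing lock on devstack-image-cache_base/<image>.vmdk, "Preparing fetch location", "Fetch image to ... tmp-sparse.vmdk") trace a fetch-if-missing pattern: cache fills are serialized per image, the cache is re-checked under the lock, and downloads land in a temporary path before being published. A generic sketch of that pattern; the paths and lock helper are illustrative, not nova's actual API:

import os
import shutil
import threading
from contextlib import contextmanager

_locks: dict[str, threading.Lock] = {}
_locks_guard = threading.Lock()


@contextmanager
def image_cache_lock(image_id):
    # One lock per cached image, so concurrent builds of the same image
    # serialize while different images proceed in parallel.
    with _locks_guard:
        lock = _locks.setdefault(image_id, threading.Lock())
    with lock:
        yield


def fetch_image_if_missing(image_id, cache_dir, download):
    cached = os.path.join(cache_dir, image_id, f"{image_id}.vmdk")
    with image_cache_lock(image_id):
        if os.path.exists(cached):      # another request already filled the cache
            return cached
        tmp = cached + ".part"          # cf. vmware_temp/<uuid>/tmp-sparse.vmdk
        os.makedirs(os.path.dirname(cached), exist_ok=True)
        download(image_id, tmp)         # e.g. stream the image from glance
        shutil.move(tmp, cached)        # publish only on success
        return cached


# usage with a fake downloader and a temporary directory
import tempfile
with tempfile.TemporaryDirectory() as d:
    path = fetch_image_if_missing(
        "5674e50f-0c0c-4f19-8379-104dac34660b", d,
        download=lambda image_id, dst: open(dst, "wb").close())
    assert os.path.exists(path)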
[ 960.573997] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5]
[ 960.573997] env[60548]: INFO nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Terminating instance
[ 960.575855] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 960.576653] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 960.577420] env[60548]: DEBUG nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 960.577710] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 960.578129] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-75f6c342-b572-46b7-8994-ee4053f37411 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.581404] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-580f8bd5-4b8c-463b-bda0-a12a374ce919 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.589889] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 960.590293] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-220cf432-c0d9-4f53-af77-e5e72d69f262 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.593066] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 960.593434] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 960.594638] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9dc851ef-46ca-4e7a-9aae-d8c16d00c69a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.603224] env[60548]: DEBUG oslo_vmware.api [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Waiting for the task: (returnval){
[ 960.603224] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]520d83e8-952b-42d1-313b-4458de909735"
[ 960.603224] env[60548]: _type = "Task"
[ 960.603224] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 960.615018] env[60548]: DEBUG oslo_vmware.api [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]520d83e8-952b-42d1-313b-4458de909735, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 960.658114] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 960.658114] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 960.658114] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Deleting the datastore file [datastore1] 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 960.658114] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-96044072-bfc3-4a3c-b17e-b5a5a3262112 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.668817] env[60548]: DEBUG oslo_vmware.api [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Waiting for the task: (returnval){
[ 960.668817] env[60548]: value = "task-4323381"
[ 960.668817] env[60548]: _type = "Task"
[ 960.668817] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 960.679679] env[60548]: DEBUG oslo_vmware.api [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Task: {'id': task-4323381, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 960.735547] env[60548]: DEBUG nova.network.neutron [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Successfully created port: a70f9e4e-ccf1-4a44-aeea-89644226015c {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 960.788098] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0157c64b-7d8b-496c-b867-b9101fe6cff9 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.796871] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6548a28-4809-4640-9f0f-0a5c33d4cf36 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.833457] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1eb3aa6-54c7-4aaa-a71e-13f51a41c881 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.846019] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e233501a-19b1-4bc4-a210-3068ec710b21 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 960.857811] env[60548]: DEBUG nova.compute.provider_tree [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 960.869989] env[60548]: DEBUG nova.scheduler.client.report [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 960.879566] env[60548]: DEBUG oslo_vmware.api [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Task: {'id': task-4323379, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073547} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 960.879925] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 960.880314] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 960.880389] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 960.880594] env[60548]: INFO nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Took 0.66 seconds to destroy the instance on the hypervisor.
[ 960.882683] env[60548]: DEBUG nova.compute.claims [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 960.882932] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 960.897834] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.342s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 960.898474] env[60548]: DEBUG nova.compute.manager [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 960.900975] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.018s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 960.926479] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 960.927355] env[60548]: DEBUG nova.compute.utils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Instance afb2cdc1-74ec-4d08-85cb-e96b4071f661 could not be found. {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 960.929818] env[60548]: DEBUG nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Instance disappeared during build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 960.930164] env[60548]: DEBUG nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 960.930423] env[60548]: DEBUG nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 960.930648] env[60548]: DEBUG nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 960.930876] env[60548]: DEBUG nova.network.neutron [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 960.934716] env[60548]: DEBUG nova.compute.utils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 960.936447] env[60548]: DEBUG nova.compute.manager [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 960.936684] env[60548]: DEBUG nova.network.neutron [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 960.943939] env[60548]: DEBUG nova.compute.manager [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 961.025172] env[60548]: DEBUG nova.compute.manager [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 961.034596] env[60548]: DEBUG nova.policy [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8a348958df440918e0f0fa5e923f9ea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e712a58321b3496d851d463771623d15', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}}
[ 961.092993] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=<?>,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-07-22T11:06:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 961.093270] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 961.093423] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 961.093595] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 961.093735] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 961.093877] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 961.094414] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 961.094693] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 961.094968] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 961.095232] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 961.095469] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 961.096816] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b1014c9-1331-4780-810d-e57fe161ce1c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.109999] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08630884-aec9-4256-9e7c-1f183775ad6e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.140015] env[60548]: DEBUG neutronclient.v2_0.client [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60548) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 961.141814] env[60548]: ERROR nova.compute.manager [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Traceback (most recent call last):
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     result = getattr(controller, method)(*args, **kwargs)
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self._get(image_id)
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     resp, body = self.http_client.get(url, headers=header)
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self.request(url, 'GET', **kwargs)
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self._handle_response(resp)
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     raise exc.from_response(resp, resp.content)
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] During handling of the above exception, another exception occurred:
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Traceback (most recent call last):
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self.driver.spawn(context, instance, image_meta,
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self._fetch_image_if_missing(context, vi)
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     image_fetch(context, vi, tmp_image_ds_loc)
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     images.fetch_image(
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     metadata = IMAGE_API.get(context, image_ref)
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return session.show(context, image_id,
[ 961.141814] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     _reraise_translated_image_exception(image_id)
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     raise new_exc.with_traceback(exc_trace)
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     result = getattr(controller, method)(*args, **kwargs)
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self._get(image_id)
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     resp, body = self.http_client.get(url, headers=header)
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self.request(url, 'GET', **kwargs)
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self._handle_response(resp)
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     raise exc.from_response(resp, resp.content)
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b.
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] During handling of the above exception, another exception occurred:
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Traceback (most recent call last):
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self._build_and_run_instance(context, instance, image,
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     with excutils.save_and_reraise_exception():
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self.force_reraise()
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     raise self.value
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     with self.rt.instance_claim(context, instance, node, allocs,
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self.abort()
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 961.142803] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return f(*args, **kwargs)
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self._unset_instance_host_and_node(instance)
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     instance.save()
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     updates, result = self.indirection_api.object_action(
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return cctxt.call(context, 'object_action', objinst=objinst,
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     result = self.transport._send(
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self._driver.send(target, ctxt, message,
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     raise result
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] nova.exception_Remote.InstanceNotFound_Remote: Instance afb2cdc1-74ec-4d08-85cb-e96b4071f661 could not be found.
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Traceback (most recent call last):
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return getattr(target, method)(*args, **kwargs)
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return fn(self, *args, **kwargs)
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     old_ref, inst_ref = db.instance_update_and_get_original(
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return f(*args, **kwargs)
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     with excutils.save_and_reraise_exception() as ectxt:
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self.force_reraise()
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     raise self.value
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return f(*args, **kwargs)
[ 961.143842] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return f(context, *args, **kwargs)
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     raise exception.InstanceNotFound(instance_id=uuid)
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] nova.exception.InstanceNotFound: Instance afb2cdc1-74ec-4d08-85cb-e96b4071f661 could not be found.
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] During handling of the above exception, another exception occurred:
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Traceback (most recent call last):
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     ret = obj(*args, **kwargs)
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     exception_handler_v20(status_code, error_body)
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     raise client_exc(message=error_message,
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Neutron server returns request_ids: ['req-040a30e5-6a1d-4d22-a18f-f925a4004a96']
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] During handling of the above exception, another exception occurred:
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Traceback (most recent call last):
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self._deallocate_network(context, instance, requested_networks)
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self.network_api.deallocate_for_instance(
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     data = neutron.list_ports(**search_opts)
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     ret = obj(*args, **kwargs)
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self.list('ports', self.ports_path, retrieve_all,
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     ret = obj(*args, **kwargs)
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 961.144904] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     for r in self._pagination(collection, path, **params):
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     res = self.get(path, params=params)
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     ret = obj(*args, **kwargs)
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self.retry_request("GET", action, body=body,
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     ret = obj(*args, **kwargs)
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     return self.do_request(method, action, body=body,
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     ret = obj(*args, **kwargs)
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     self._handle_fault_response(status_code, replybody, resp)
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]     raise exception.Unauthorized()
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] nova.exception.Unauthorized: Not authorized.
[ 961.150598] env[60548]: ERROR nova.compute.manager [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661]
[ 961.150598] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 961.150598] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Creating directory with path [datastore1] vmware_temp/1514c933-f867-4f8f-8d41-52be8b3652eb/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 961.150598] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-180da534-da85-4dd2-affe-eca2fb4c0122 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.158635] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Created directory with path [datastore1] vmware_temp/1514c933-f867-4f8f-8d41-52be8b3652eb/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 961.158894] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Fetch image to [datastore1] vmware_temp/1514c933-f867-4f8f-8d41-52be8b3652eb/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 961.159052] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/1514c933-f867-4f8f-8d41-52be8b3652eb/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 961.159936] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-537e7b26-cfae-494f-9475-58c4e9b38c69 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.170825] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1969a8dd-d6d3-42b3-b1c2-5ebc77c3c23d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.195049] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-579fad21-53dd-4ffe-82ba-cefc080652c8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.198297] env[60548]: DEBUG oslo_vmware.api [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Task: {'id': task-4323381, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076198} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 961.199720] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0c8ec332-5019-4190-91b8-9f74d2637704 tempest-VolumesAdminNegativeTest-1171589411 tempest-VolumesAdminNegativeTest-1171589411-project-member] Lock "afb2cdc1-74ec-4d08-85cb-e96b4071f661" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 314.371s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 961.200347] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 961.200612] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 961.200864] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 961.201098] env[60548]: INFO nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Took 0.62 seconds to destroy the instance on the hypervisor.
[ 961.204169] env[60548]: DEBUG nova.compute.claims [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 961.204377] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 961.204642] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 961.237280] env[60548]: DEBUG nova.compute.manager [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 961.239794] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.035s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 961.240595] env[60548]: DEBUG nova.compute.utils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Instance 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5 could not be found. {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 961.244020] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-618f1eb5-3267-40be-a6f2-38956ef94022 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.245083] env[60548]: DEBUG nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Instance disappeared during build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 961.245296] env[60548]: DEBUG nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 961.245498] env[60548]: DEBUG nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 961.245676] env[60548]: DEBUG nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 961.245870] env[60548]: DEBUG nova.network.neutron [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 961.252323] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fd0c7632-cab9-4a5c-8bfa-d45cda17e8bf {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.278347] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 961.296533] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 961.296899] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 961.298823] env[60548]: INFO nova.compute.claims [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 961.314933] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 961.315866] env[60548]: ERROR nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b.
[ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Traceback (most recent call last): [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] result = getattr(controller, method)(*args, **kwargs) [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self._get(image_id) [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return RequestIdProxy(wrapped(*args, **kwargs)) [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] resp, body = self.http_client.get(url, headers=header) [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self.request(url, 'GET', **kwargs) [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self._handle_response(resp) [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] raise exc.from_response(resp, resp.content) [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] During handling of the above exception, another exception occurred: [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Traceback (most recent call last): [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] yield resources [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self.driver.spawn(context, instance, image_meta, [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self._vmops.spawn(context, instance, image_meta, injected_files, [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self._fetch_image_if_missing(context, vi) [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] image_fetch(context, vi, tmp_image_ds_loc) [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] images.fetch_image( [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 961.315866] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] metadata = IMAGE_API.get(context, image_ref) [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return session.show(context, image_id, [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] _reraise_translated_image_exception(image_id) [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", 
line 1031, in _reraise_translated_image_exception [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] raise new_exc.with_traceback(exc_trace) [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] result = getattr(controller, method)(*args, **kwargs) [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self._get(image_id) [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return RequestIdProxy(wrapped(*args, **kwargs)) [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] resp, body = self.http_client.get(url, headers=header) [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self.request(url, 'GET', **kwargs) [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self._handle_response(resp) [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] raise exc.from_response(resp, resp.content) [ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b. 
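
The chained traceback above ends in nova's image-exception translation: the glanceclient HTTP 401 is caught in show() in nova/image/glance.py, mapped to nova.exception.ImageNotAuthorized by _reraise_translated_image_exception, and re-raised with the original frames via new_exc.with_traceback(exc_trace), which is why the two tracebacks appear joined by "During handling of the above exception". A minimal sketch of that pattern, with simplified stand-in exception classes rather than the actual nova source:

    import sys

    class HTTPUnauthorized(Exception):
        """Stand-in for glanceclient.exc.HTTPUnauthorized."""

    class ImageNotAuthorized(Exception):
        """Stand-in for nova.exception.ImageNotAuthorized."""

    def _translate_image_exception(image_id, exc):
        # Map a client-side auth failure onto the service's own hierarchy.
        if isinstance(exc, HTTPUnauthorized):
            return ImageNotAuthorized(f"Not authorized for image {image_id}.")
        return exc

    def show(image_id):
        try:
            raise HTTPUnauthorized("HTTP 401 Unauthorized")  # the glance GET
        except Exception:
            exc_type, exc_value, exc_trace = sys.exc_info()
            new_exc = _translate_image_exception(image_id, exc_value)
            # Re-raise the translated error but keep the original frames,
            # producing the chained traceback seen in the log.
            raise new_exc.with_traceback(exc_trace)
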
[ 961.317418] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 961.317418] env[60548]: INFO nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Terminating instance [ 961.318044] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 961.318878] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 961.318878] env[60548]: DEBUG nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 961.318878] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 961.319187] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5f228bca-af54-4d90-98a4-4418ef80ae32 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.321809] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7602bfe-1d0e-4d5c-be79-b7ca713ff25a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.331315] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 961.332617] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4df89c76-ef65-40fe-b4a7-6f12a1190dc6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.334582] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 961.334793] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 
tempest-ListImageFiltersTestJSON-1294019059-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 961.335549] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-70e59824-82f1-414a-99ac-a27182433e4b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.341586] env[60548]: DEBUG oslo_vmware.api [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Waiting for the task: (returnval){ [ 961.341586] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52fed18a-e6e6-1fc1-cc50-488b70194826" [ 961.341586] env[60548]: _type = "Task" [ 961.341586] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 961.355387] env[60548]: DEBUG oslo_vmware.api [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52fed18a-e6e6-1fc1-cc50-488b70194826, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 961.417939] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 961.417939] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 961.418323] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Deleting the datastore file [datastore1] be11788c-634f-40c0-8c8c-d6253d0e68ad {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 961.418323] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2946eca2-bd2d-4d52-afb0-649617958555 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.427563] env[60548]: DEBUG oslo_vmware.api [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Waiting for the task: (returnval){ [ 961.427563] env[60548]: value = "task-4323383" [ 961.427563] env[60548]: _type = "Task" [ 961.427563] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 961.437324] env[60548]: DEBUG oslo_vmware.api [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Task: {'id': task-4323383, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 961.450358] env[60548]: DEBUG neutronclient.v2_0.client [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60548) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 961.452284] env[60548]: ERROR nova.compute.manager [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Traceback (most recent call last): [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] result = getattr(controller, method)(*args, **kwargs) [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self._get(image_id) [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] resp, body = self.http_client.get(url, headers=header) [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self.request(url, 'GET', **kwargs) [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self._handle_response(resp) [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in 
_handle_response [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] raise exc.from_response(resp, resp.content) [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] During handling of the above exception, another exception occurred: [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Traceback (most recent call last): [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self.driver.spawn(context, instance, image_meta, [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self._fetch_image_if_missing(context, vi) [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] image_fetch(context, vi, tmp_image_ds_loc) [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] images.fetch_image( [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] metadata = IMAGE_API.get(context, image_ref) [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return session.show(context, image_id, [ 961.452284] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 961.453897] env[60548]: ERROR 
nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] _reraise_translated_image_exception(image_id) [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] raise new_exc.with_traceback(exc_trace) [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] result = getattr(controller, method)(*args, **kwargs) [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self._get(image_id) [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] resp, body = self.http_client.get(url, headers=header) [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self.request(url, 'GET', **kwargs) [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self._handle_response(resp) [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] raise exc.from_response(resp, resp.content) [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b. 
[ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] During handling of the above exception, another exception occurred: [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Traceback (most recent call last): [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self._build_and_run_instance(context, instance, image, [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] with excutils.save_and_reraise_exception(): [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self.force_reraise() [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] raise self.value [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] with self.rt.instance_claim(context, instance, node, allocs, [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self.abort() [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 961.453897] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return f(*args, **kwargs) [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self._unset_instance_host_and_node(instance) [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 
2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] instance.save() [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] updates, result = self.indirection_api.object_action( [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return cctxt.call(context, 'object_action', objinst=objinst, [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] result = self.transport._send( [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self._driver.send(target, ctxt, message, [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] raise result [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] nova.exception_Remote.InstanceNotFound_Remote: Instance 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5 could not be found. 
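
The oslo_utils/excutils.py frames in this chain come from save_and_reraise_exception: _build_and_run_instance re-raises the saved spawn failure through force_reraise(), unwinding then triggers abort_instance_claim via the claim context manager, and the abort's instance.save() fails with the remote InstanceNotFound because the instance row is already gone. A rough sketch of the context manager's mechanism, simplified from oslo.utils rather than copied from it:

    import sys

    class save_and_reraise_exception:
        """Simplified sketch of oslo_utils.excutils.save_and_reraise_exception."""

        def __init__(self):
            self.reraise = True

        def __enter__(self):
            # Save the exception currently being handled.
            self.type_, self.value, self.tb = sys.exc_info()
            return self

        def force_reraise(self):
            if self.type_ is None:
                raise RuntimeError("No exception to reraise")
            raise self.value

        def __exit__(self, exc_type, exc_val, exc_tb):
            if exc_type is not None:
                # A new exception was raised inside the with block;
                # let that one propagate instead.
                return False
            if self.reraise:
                self.force_reraise()

    # Typical call site, as in nova's build path:
    #
    #     try:
    #         spawn()
    #     except Exception:
    #         with save_and_reraise_exception() as ctxt:
    #             run_cleanup()  # setting ctxt.reraise = False would
    #                            # suppress the original error instead
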
[ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Traceback (most recent call last): [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return getattr(target, method)(*args, **kwargs) [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return fn(self, *args, **kwargs) [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] old_ref, inst_ref = db.instance_update_and_get_original( [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return f(*args, **kwargs) [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] with excutils.save_and_reraise_exception() as ectxt: [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self.force_reraise() [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] raise self.value [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 961.454820] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return f(*args, **kwargs) [ 961.454820] 
env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return f(context, *args, **kwargs) [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] raise exception.InstanceNotFound(instance_id=uuid) [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] nova.exception.InstanceNotFound: Instance 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5 could not be found. [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] During handling of the above exception, another exception occurred: [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Traceback (most recent call last): [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] ret = obj(*args, **kwargs) [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] exception_handler_v20(status_code, error_body) [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] raise client_exc(message=error_message, [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 
2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Neutron server returns request_ids: ['req-31b115a3-d653-4f5b-acaa-616e5dfe794c'] [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] During handling of the above exception, another exception occurred: [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Traceback (most recent call last): [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self._deallocate_network(context, instance, requested_networks) [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self.network_api.deallocate_for_instance( [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] data = neutron.list_ports(**search_opts) [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] ret = obj(*args, **kwargs) [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self.list('ports', self.ports_path, retrieve_all, [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] ret = obj(*args, **kwargs) [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 961.456324] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] for r in self._pagination(collection, path, **params): [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] res = self.get(path, params=params) [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 
2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] ret = obj(*args, **kwargs) [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self.retry_request("GET", action, body=body, [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] ret = obj(*args, **kwargs) [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] return self.do_request(method, action, body=body, [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] ret = obj(*args, **kwargs) [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] self._handle_fault_response(status_code, replybody, resp) [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] raise exception.Unauthorized() [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] nova.exception.Unauthorized: Not authorized. [ 961.457385] env[60548]: ERROR nova.compute.manager [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] [ 961.491374] env[60548]: DEBUG oslo_concurrency.lockutils [None req-eb90076f-127d-4345-af44-bfa8dc28fd29 tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Lock "2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 306.705s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 961.506963] env[60548]: DEBUG nova.compute.manager [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Starting instance... 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 961.548343] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-527055b8-ded4-463a-8845-841709cc3f98 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.555988] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afbe15f3-3993-4973-97e8-928b6adde94d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.599458] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 961.600245] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3b72a47-6dc6-4cc4-a856-43c0138c2db4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.608883] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1930fa9b-8366-4eff-9889-75eae17fdaf6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.623645] env[60548]: DEBUG nova.compute.provider_tree [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 961.633498] env[60548]: DEBUG nova.scheduler.client.report [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 961.650875] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.354s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 961.651409] env[60548]: DEBUG nova.compute.manager [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 961.654490] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.055s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 961.656042] env[60548]: INFO nova.compute.claims [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 961.703162] env[60548]: DEBUG nova.compute.utils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 961.704544] env[60548]: DEBUG nova.compute.manager [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 961.704712] env[60548]: DEBUG nova.network.neutron [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 961.726630] env[60548]: DEBUG nova.compute.manager [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 961.797397] env[60548]: DEBUG nova.compute.manager [req-9b29c653-3b9c-4af0-bd73-f2ddf6b1b885 req-8a429013-dd08-4519-9959-42ab245caecb service nova] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Received event network-vif-plugged-a70f9e4e-ccf1-4a44-aeea-89644226015c {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 961.797397] env[60548]: DEBUG oslo_concurrency.lockutils [req-9b29c653-3b9c-4af0-bd73-f2ddf6b1b885 req-8a429013-dd08-4519-9959-42ab245caecb service nova] Acquiring lock "585e3015-faef-40df-b3dd-04d2c8e4dd00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 961.797397] env[60548]: DEBUG oslo_concurrency.lockutils [req-9b29c653-3b9c-4af0-bd73-f2ddf6b1b885 req-8a429013-dd08-4519-9959-42ab245caecb service nova] Lock "585e3015-faef-40df-b3dd-04d2c8e4dd00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 961.797397] env[60548]: DEBUG oslo_concurrency.lockutils [req-9b29c653-3b9c-4af0-bd73-f2ddf6b1b885 req-8a429013-dd08-4519-9959-42ab245caecb service nova] Lock "585e3015-faef-40df-b3dd-04d2c8e4dd00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 961.797397] env[60548]: DEBUG nova.compute.manager [req-9b29c653-3b9c-4af0-bd73-f2ddf6b1b885 req-8a429013-dd08-4519-9959-42ab245caecb service nova] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] No waiting events found dispatching network-vif-plugged-a70f9e4e-ccf1-4a44-aeea-89644226015c {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 961.797397] env[60548]: WARNING nova.compute.manager [req-9b29c653-3b9c-4af0-bd73-f2ddf6b1b885 req-8a429013-dd08-4519-9959-42ab245caecb service nova] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Received unexpected event network-vif-plugged-a70f9e4e-ccf1-4a44-aeea-89644226015c for instance with vm_state building and task_state spawning.
[ 961.822248] env[60548]: DEBUG nova.policy [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd122dc69e91c4228826688c92f76b66d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b69a6d4834e24a3cbc4019ee66c0d841', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}}
[ 961.841436] env[60548]: DEBUG nova.network.neutron [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Successfully created port: 0fecdaa6-f624-45a1-b74d-c1443e12532d {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 961.852286] env[60548]: DEBUG nova.compute.manager [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 961.861393] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 961.861490] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Creating directory with path [datastore1] vmware_temp/ccdaf143-4b0d-43f7-8f18-07b7b827ed20/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 961.861682] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a49e4398-0c15-4974-837b-52511408a94e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.875064] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Created directory with path [datastore1] vmware_temp/ccdaf143-4b0d-43f7-8f18-07b7b827ed20/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 961.875365] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Fetch image to [datastore1] vmware_temp/ccdaf143-4b0d-43f7-8f18-07b7b827ed20/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 961.875463] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/ccdaf143-4b0d-43f7-8f18-07b7b827ed20/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 961.876244] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9c97680-20d2-4fd9-a134-76d7663dc939 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.881024] env[60548]: DEBUG nova.virt.hardware [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 961.881290] env[60548]: DEBUG nova.virt.hardware [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 961.881487] env[60548]: DEBUG nova.virt.hardware [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 961.881615] env[60548]: DEBUG nova.virt.hardware [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 961.881756] env[60548]: DEBUG nova.virt.hardware [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 961.881894] env[60548]: DEBUG nova.virt.hardware [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 961.882177] env[60548]: DEBUG nova.virt.hardware [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 961.882256] env[60548]: DEBUG nova.virt.hardware [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 961.882409] env[60548]: DEBUG nova.virt.hardware [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 961.882565] env[60548]: DEBUG nova.virt.hardware [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 961.882734] env[60548]: DEBUG nova.virt.hardware [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 961.883589] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4af0564-a82a-4994-a2f7-1fea9b9a8935 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.897856] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b1d4904-ec84-496f-977c-cf1d0135787e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.904247] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e42de81b-fe97-4b1d-9556-7a6ba675e9d2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.929937] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1aacf3c-b53d-4d6b-92c1-d7994664b61b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.934632] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-724a10f7-5313-425c-8383-6f7bb70fc8a6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.973644] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf90e6cb-2907-4ce3-bda0-d07da6978f42 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.977471] env[60548]: DEBUG oslo_vmware.api [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Task: {'id': task-4323383, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081368} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 961.978218] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0411f68-8707-47e7-b7ca-3a5249f42b1d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 961.980609] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 961.980790] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 961.980957] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 961.981145] env[60548]: INFO nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Took 0.66 seconds to destroy the instance on the hypervisor.
[ 961.983361] env[60548]: DEBUG nova.compute.claims [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 961.983549] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 962.013720] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcea312d-9e50-4527-ade8-5560bfbd3dd6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 962.015502] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3274b630-34e1-471b-b760-9bd0d676e5b5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 962.023184] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2470ea8e-6a10-48a8-baeb-b956e1df3c88 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 962.040440] env[60548]: DEBUG nova.compute.provider_tree [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 962.044954] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 962.048824] env[60548]: DEBUG nova.network.neutron [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Successfully updated port: a70f9e4e-ccf1-4a44-aeea-89644226015c {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 962.051413] env[60548]: DEBUG nova.scheduler.client.report [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 962.060434] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Acquiring lock "refresh_cache-585e3015-faef-40df-b3dd-04d2c8e4dd00" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 962.060808] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Acquired lock "refresh_cache-585e3015-faef-40df-b3dd-04d2c8e4dd00" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 962.060808] env[60548]: DEBUG nova.network.neutron [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 962.071480] env[60548]: DEBUG nova.network.neutron [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Successfully updated port: 59582163-0304-4f82-9a45-39db58047179 {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 962.074814] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.420s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 962.075277] env[60548]: DEBUG nova.compute.manager [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 962.080892] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.095s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 962.083509] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Acquiring lock "refresh_cache-ad98988d-92aa-4ace-8e40-cd316758002e" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 962.083734] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Acquired lock "refresh_cache-ad98988d-92aa-4ace-8e40-cd316758002e" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 962.083988] env[60548]: DEBUG nova.network.neutron [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 962.119353] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.041s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 962.121998] env[60548]: DEBUG nova.compute.utils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Instance be11788c-634f-40c0-8c8c-d6253d0e68ad could not be found. {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 962.122648] env[60548]: DEBUG nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Instance disappeared during build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 962.123053] env[60548]: DEBUG nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 962.123461] env[60548]: DEBUG nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 962.123935] env[60548]: DEBUG nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 962.124237] env[60548]: DEBUG nova.network.neutron [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 962.128036] env[60548]: DEBUG nova.compute.utils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 962.128633] env[60548]: DEBUG nova.compute.manager [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 962.128856] env[60548]: DEBUG nova.network.neutron [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 962.143215] env[60548]: DEBUG nova.compute.manager [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 962.182794] env[60548]: DEBUG nova.network.neutron [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 962.223172] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 962.224144] env[60548]: ERROR nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b.
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Traceback (most recent call last):
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] result = getattr(controller, method)(*args, **kwargs)
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self._get(image_id)
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return RequestIdProxy(wrapped(*args, **kwargs))
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] resp, body = self.http_client.get(url, headers=header)
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self.request(url, 'GET', **kwargs)
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self._handle_response(resp)
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] raise exc.from_response(resp, resp.content)
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] During handling of the above exception, another exception occurred:
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Traceback (most recent call last):
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] yield resources
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self.driver.spawn(context, instance, image_meta,
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self._fetch_image_if_missing(context, vi)
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] image_fetch(context, vi, tmp_image_ds_loc)
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] images.fetch_image(
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 962.224144] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] metadata = IMAGE_API.get(context, image_ref)
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return session.show(context, image_id,
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] _reraise_translated_image_exception(image_id)
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] raise new_exc.with_traceback(exc_trace)
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] result = getattr(controller, method)(*args, **kwargs)
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self._get(image_id)
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return RequestIdProxy(wrapped(*args, **kwargs))
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] resp, body = self.http_client.get(url, headers=header)
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self.request(url, 'GET', **kwargs)
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self._handle_response(resp)
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] raise exc.from_response(resp, resp.content)
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b.
[ 962.225219] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 962.225219] env[60548]: INFO nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Terminating instance
[ 962.226333] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 962.226548] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 962.230029] env[60548]: DEBUG nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 962.230029] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 962.230604] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-08c54b5c-41fe-4789-b028-91381322a5f0 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 962.233498] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7502b8ab-bd50-4951-a72c-d34ba14a9be8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 962.238051] env[60548]: DEBUG nova.compute.manager [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 962.247934] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 962.249465] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9e1a1e39-9410-4430-8787-a2d0d68fa0a8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 962.251683] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 962.251867] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 962.252626] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-728ddd69-ec22-4698-84c3-933cdd21b9f8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 962.260179] env[60548]: DEBUG oslo_vmware.api [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Waiting for the task: (returnval){
[ 962.260179] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52680a7a-0fbc-226c-3085-ef423c96ecfd"
[ 962.260179] env[60548]: _type = "Task"
[ 962.260179] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 962.270059] env[60548]: DEBUG oslo_vmware.api [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52680a7a-0fbc-226c-3085-ef423c96ecfd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 962.319892] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 962.320516] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 962.320516] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 962.320516] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 962.320646] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 962.320766] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 962.320976] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 962.321171] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 962.321433] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 962.321503] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 962.321676] env[60548]: DEBUG nova.virt.hardware [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 962.322591] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e44b15cc-792f-4d2c-ba5f-bfe0be816031 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 962.326896] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 962.327125] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 962.327285] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Deleting the datastore file [datastore1] 306f3cb9-3028-4ff2-8090-2c9c1c72efc1 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 962.327960] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4e0bb5b4-bbb9-4563-810b-10b77db832d5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 962.333892] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48954150-2135-46cc-a2b9-d0806279f1e1 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 962.338561] env[60548]: DEBUG nova.network.neutron [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 962.341777] env[60548]: DEBUG oslo_vmware.api [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Waiting for the task: (returnval){
[ 962.341777] env[60548]: value = "task-4323385"
[ 962.341777] env[60548]: _type = "Task"
[ 962.341777] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 962.359554] env[60548]: DEBUG oslo_vmware.api [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Task: {'id': task-4323385, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 962.408097] env[60548]: DEBUG neutronclient.v2_0.client [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60548) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 962.409635] env[60548]: ERROR nova.compute.manager [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Traceback (most recent call last):
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] result = getattr(controller, method)(*args, **kwargs)
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self._get(image_id)
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return RequestIdProxy(wrapped(*args, **kwargs))
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] resp, body = self.http_client.get(url, headers=header)
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self.request(url, 'GET', **kwargs)
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self._handle_response(resp)
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] raise exc.from_response(resp, resp.content)
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad]
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] During handling of the above exception, another exception occurred:
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad]
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Traceback (most recent call last):
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self.driver.spawn(context, instance, image_meta,
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self._fetch_image_if_missing(context, vi)
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] image_fetch(context, vi, tmp_image_ds_loc)
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] images.fetch_image(
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] metadata = IMAGE_API.get(context, image_ref)
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return session.show(context, image_id,
[ 962.409635] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] _reraise_translated_image_exception(image_id)
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] raise new_exc.with_traceback(exc_trace)
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] result = getattr(controller, method)(*args, **kwargs)
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self._get(image_id)
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return RequestIdProxy(wrapped(*args, **kwargs))
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] resp, body = self.http_client.get(url, headers=header)
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self.request(url, 'GET', **kwargs)
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self._handle_response(resp)
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] raise exc.from_response(resp, resp.content)
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b.
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad]
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] During handling of the above exception, another exception occurred:
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad]
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Traceback (most recent call last):
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self._build_and_run_instance(context, instance, image,
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] with excutils.save_and_reraise_exception():
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self.force_reraise()
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] raise self.value
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] with self.rt.instance_claim(context, instance, node, allocs,
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self.abort()
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 962.410869] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return f(*args, **kwargs)
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self._unset_instance_host_and_node(instance)
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] instance.save()
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] updates, result = self.indirection_api.object_action(
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return cctxt.call(context, 'object_action', objinst=objinst,
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] result = self.transport._send(
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self._driver.send(target, ctxt, message,
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] raise result
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] nova.exception_Remote.InstanceNotFound_Remote: Instance be11788c-634f-40c0-8c8c-d6253d0e68ad could not be found.
[ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Traceback (most recent call last): [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return getattr(target, method)(*args, **kwargs) [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return fn(self, *args, **kwargs) [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] old_ref, inst_ref = db.instance_update_and_get_original( [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return f(*args, **kwargs) [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] with excutils.save_and_reraise_exception() as ectxt: [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self.force_reraise() [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] raise self.value [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 962.412211] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return f(*args, **kwargs) [ 962.412211] 
env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return f(context, *args, **kwargs) [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] raise exception.InstanceNotFound(instance_id=uuid) [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] nova.exception.InstanceNotFound: Instance be11788c-634f-40c0-8c8c-d6253d0e68ad could not be found. [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] During handling of the above exception, another exception occurred: [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Traceback (most recent call last): [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] ret = obj(*args, **kwargs) [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] exception_handler_v20(status_code, error_body) [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] raise client_exc(message=error_message, [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: 
be11788c-634f-40c0-8c8c-d6253d0e68ad] Neutron server returns request_ids: ['req-442a275b-ae5e-4c3f-b198-f18bf4be13e1'] [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] During handling of the above exception, another exception occurred: [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Traceback (most recent call last): [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self._deallocate_network(context, instance, requested_networks) [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self.network_api.deallocate_for_instance( [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] data = neutron.list_ports(**search_opts) [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] ret = obj(*args, **kwargs) [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self.list('ports', self.ports_path, retrieve_all, [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] ret = obj(*args, **kwargs) [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 962.413363] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] for r in self._pagination(collection, path, **params): [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] res = self.get(path, params=params) [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: 
be11788c-634f-40c0-8c8c-d6253d0e68ad] ret = obj(*args, **kwargs) [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self.retry_request("GET", action, body=body, [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] ret = obj(*args, **kwargs) [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] return self.do_request(method, action, body=body, [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] ret = obj(*args, **kwargs) [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] self._handle_fault_response(status_code, replybody, resp) [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] raise exception.Unauthorized() [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] nova.exception.Unauthorized: Not authorized. [ 962.414384] env[60548]: ERROR nova.compute.manager [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] [ 962.432868] env[60548]: DEBUG oslo_concurrency.lockutils [None req-742b174a-c430-401c-9610-59b04ea9602b tempest-SecurityGroupsTestJSON-1924620447 tempest-SecurityGroupsTestJSON-1924620447-project-member] Lock "be11788c-634f-40c0-8c8c-d6253d0e68ad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 306.958s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 962.443982] env[60548]: DEBUG nova.compute.manager [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Starting instance... 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 962.474311] env[60548]: DEBUG nova.policy [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c8a348958df440918e0f0fa5e923f9ea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e712a58321b3496d851d463771623d15', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}} [ 962.499704] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 962.499954] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 962.501806] env[60548]: INFO nova.compute.claims [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 962.519526] env[60548]: DEBUG nova.network.neutron [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Updating instance_info_cache with network_info: [{"id": "a70f9e4e-ccf1-4a44-aeea-89644226015c", "address": "fa:16:3e:7d:04:ee", "network": {"id": "f525a1b6-9f81-4d75-8d90-d86d731ac984", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-765077462-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3c6e9283a81a4cc197709fd070c13c34", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "255460d5-71d4-4bfd-87f1-acc10085db7f", "external-id": "nsx-vlan-transportzone-152", "segmentation_id": 152, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa70f9e4e-cc", "ovs_interfaceid": "a70f9e4e-ccf1-4a44-aeea-89644226015c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 962.538028] env[60548]: DEBUG 
oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Releasing lock "refresh_cache-585e3015-faef-40df-b3dd-04d2c8e4dd00" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 962.538028] env[60548]: DEBUG nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Instance network_info: |[{"id": "a70f9e4e-ccf1-4a44-aeea-89644226015c", "address": "fa:16:3e:7d:04:ee", "network": {"id": "f525a1b6-9f81-4d75-8d90-d86d731ac984", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-765077462-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3c6e9283a81a4cc197709fd070c13c34", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "255460d5-71d4-4bfd-87f1-acc10085db7f", "external-id": "nsx-vlan-transportzone-152", "segmentation_id": 152, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa70f9e4e-cc", "ovs_interfaceid": "a70f9e4e-ccf1-4a44-aeea-89644226015c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 962.538259] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7d:04:ee', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '255460d5-71d4-4bfd-87f1-acc10085db7f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a70f9e4e-ccf1-4a44-aeea-89644226015c', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 962.547739] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Creating folder: Project (3c6e9283a81a4cc197709fd070c13c34). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 962.548881] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6fd3ed4e-eecd-480a-a5c9-01709daac470 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.571067] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Created folder: Project (3c6e9283a81a4cc197709fd070c13c34) in parent group-v850287. 
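Each "Invoking Folder.CreateFolder ..." record here is one SOAP round trip made through the oslo.vmware session established at startup, and the "Waiting for the task ... progress is N%" records that follow come from its task poller. A rough sketch of that call pattern, assuming a reachable vCenter; the host, credentials, and managed-object references below are placeholders, not values from this log:

```python
from oslo_vmware import api

# Placeholders -- a real run needs a live vCenter plus real managed-object
# references for the parent folder, resource pool, and VM config spec.
session = api.VMwareAPISession('vc.example.org', 'user', 'secret',
                               api_retry_count=10, task_poll_interval=0.5)


def create_folder(parent_folder_ref, name):
    # Produces an "Invoking Folder.CreateFolder ..." record like the ones
    # around this point in the log.
    return session.invoke_api(session.vim, 'CreateFolder',
                              parent_folder_ref, name=name)


def create_vm(folder_ref, config_spec, res_pool_ref):
    # Produces "Invoking Folder.CreateVM_Task ..."; wait_for_task() then
    # drives the "Waiting for the task ... progress is N%" polling records.
    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=config_spec, pool=res_pool_ref)
    return session.wait_for_task(task)
```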
[ 962.571411] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Creating folder: Instances. Parent ref: group-v850346. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 962.572546] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2c68c130-46b4-41c3-a2de-979df8802240 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.583164] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Created folder: Instances in parent group-v850346. [ 962.583429] env[60548]: DEBUG oslo.service.loopingcall [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 962.583621] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 962.583821] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1a44a2da-25ed-40c1-b3ad-c0f87a1baf64 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.610792] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 962.610792] env[60548]: value = "task-4323388" [ 962.610792] env[60548]: _type = "Task" [ 962.610792] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 962.626718] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323388, 'name': CreateVM_Task} progress is 5%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 962.747355] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6aa40bd2-a3a4-4a2d-83a8-f0d68c5a6035 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.756081] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a740444d-bb28-4323-9d7c-3857ba76b319 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.790621] env[60548]: DEBUG nova.network.neutron [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Successfully created port: 1c9d862d-8a0d-4c91-834b-72d52fd1fcb7 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 962.793099] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2afccb09-16fa-4da2-ab64-31d9ae3557f2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.803371] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 962.804033] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Creating directory with path [datastore1] vmware_temp/410ffba7-7c9a-4c2c-b693-e3545b2577a7/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 962.804033] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-49ccf9fe-7c99-4286-9dc8-8f882368d606 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.809144] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7605867b-c903-48aa-a003-a0e4d9eae4f7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.824156] env[60548]: DEBUG nova.compute.provider_tree [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 962.826962] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Created directory with path [datastore1] vmware_temp/410ffba7-7c9a-4c2c-b693-e3545b2577a7/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 962.827185] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 
46737200-2da8-41ee-b33e-3bb6cc3e4618] Fetch image to [datastore1] vmware_temp/410ffba7-7c9a-4c2c-b693-e3545b2577a7/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 962.827353] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/410ffba7-7c9a-4c2c-b693-e3545b2577a7/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 962.828420] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-def93416-e14d-4690-b4d1-fff9ff54c079 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.837682] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a151b89d-db9e-4599-82e5-b43d6cc9d200 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.842494] env[60548]: DEBUG nova.scheduler.client.report [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 962.856024] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2186221-83e2-4cda-ba17-8e424970af4a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.866965] env[60548]: DEBUG oslo_vmware.api [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Task: {'id': task-4323385, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082521} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 962.866965] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.367s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 962.867158] env[60548]: DEBUG nova.compute.manager [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Start building networks asynchronously for instance. 
{{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 962.869760] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 962.869933] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 962.870239] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 962.870441] env[60548]: INFO nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Took 0.64 seconds to destroy the instance on the hypervisor. [ 962.898316] env[60548]: DEBUG nova.compute.claims [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 962.898316] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 962.898316] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 962.902789] env[60548]: DEBUG nova.network.neutron [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Successfully updated port: 0fecdaa6-f624-45a1-b74d-c1443e12532d {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 962.904473] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a8df653-6dfc-462d-af52-a7659a9b9bc3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.915028] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bcac59c4-0111-472d-abc1-a17c1c1dcd36 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.918118] 
env[60548]: DEBUG nova.compute.utils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 962.919855] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquiring lock "refresh_cache-e3fd811a-186d-436f-bdef-a910a3ccd416" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 962.919941] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquired lock "refresh_cache-e3fd811a-186d-436f-bdef-a910a3ccd416" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 962.920282] env[60548]: DEBUG nova.network.neutron [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 962.922558] env[60548]: DEBUG nova.compute.manager [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 962.922759] env[60548]: DEBUG nova.network.neutron [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 962.935612] env[60548]: DEBUG nova.compute.manager [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Start building block device mappings for instance. 
{{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 962.951159] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 962.956333] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.058s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 962.957083] env[60548]: DEBUG nova.compute.utils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Instance 306f3cb9-3028-4ff2-8090-2c9c1c72efc1 could not be found. {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 962.958773] env[60548]: DEBUG nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Instance disappeared during build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 962.958773] env[60548]: DEBUG nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 962.959074] env[60548]: DEBUG nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 962.959279] env[60548]: DEBUG nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 962.959329] env[60548]: DEBUG nova.network.neutron [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 963.019480] env[60548]: DEBUG nova.network.neutron [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Instance cache missing network info. 
{{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 963.023545] env[60548]: DEBUG nova.compute.manager [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 963.076031] env[60548]: DEBUG nova.virt.hardware [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=<?>,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-07-22T11:06:48Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 963.076299] env[60548]: DEBUG nova.virt.hardware [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 963.076360] env[60548]: DEBUG nova.virt.hardware [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 963.076534] env[60548]: DEBUG nova.virt.hardware [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 963.076683] env[60548]: DEBUG nova.virt.hardware [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 963.076837] env[60548]: DEBUG nova.virt.hardware [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 963.077054] env[60548]: DEBUG nova.virt.hardware [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 963.077218] env[60548]: DEBUG nova.virt.hardware [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 
tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 963.077385] env[60548]: DEBUG nova.virt.hardware [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 963.077546] env[60548]: DEBUG nova.virt.hardware [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 963.077716] env[60548]: DEBUG nova.virt.hardware [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 963.078684] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a2da8e3-bdf7-45d2-ac3a-005d3a7995e3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 963.082544] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 963.083308] env[60548]: ERROR nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b. 
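The dump that follows shows the translation step already visible in the nova/image/glance.py frames above: glanceclient's HTTPUnauthorized is caught, mapped to nova's ImageNotAuthorized, and re-raised on the original traceback, which is why both exceptions appear chained. A simplified sketch of that translate-and-reraise idiom, not the actual nova helper, assuming nova and python-glanceclient are importable:

```python
import sys

from glanceclient import exc as glance_exc

from nova import exception


def show_image(client, image_id):
    """Fetch image metadata, translating glance auth errors to nova's."""
    try:
        return client.images.get(image_id)
    except glance_exc.HTTPUnauthorized:
        # Re-raise on the original traceback so the log shows both the
        # glanceclient frames and the nova-level exception, as below.
        exc_trace = sys.exc_info()[2]
        new_exc = exception.ImageNotAuthorized(image_id=image_id)
        raise new_exc.with_traceback(exc_trace)
```

Translating this way surfaces a nova-level exception to the compute manager while keeping the HTTP client frames available for debugging.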
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Traceback (most recent call last): [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] result = getattr(controller, method)(*args, **kwargs) [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self._get(image_id) [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return RequestIdProxy(wrapped(*args, **kwargs)) [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] resp, body = self.http_client.get(url, headers=header) [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self.request(url, 'GET', **kwargs) [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self._handle_response(resp) [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] raise exc.from_response(resp, resp.content) [ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618]
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] During handling of the above exception, another exception occurred:
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618]
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Traceback (most recent call last):
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] yield resources
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self.driver.spawn(context, instance, image_meta,
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self._fetch_image_if_missing(context, vi)
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] image_fetch(context, vi, tmp_image_ds_loc)
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] images.fetch_image(
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 963.083308] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] metadata = IMAGE_API.get(context, image_ref)
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return session.show(context, image_id,
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] _reraise_translated_image_exception(image_id)
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] raise new_exc.with_traceback(exc_trace)
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] result = getattr(controller, method)(*args, **kwargs)
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self._get(image_id)
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return RequestIdProxy(wrapped(*args, **kwargs))
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] resp, body = self.http_client.get(url, headers=header)
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self.request(url, 'GET', **kwargs)
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self._handle_response(resp)
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] raise exc.from_response(resp, resp.content)
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b.
[ 963.084409] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618]
[ 963.084409] env[60548]: INFO nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Terminating instance
[ 963.085571] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 963.085756] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 963.086430] env[60548]: DEBUG nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 963.086615] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 963.087247] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e2a4b2e2-5632-4223-9ed7-00e27b983ab5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.089937] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3eda835c-4510-4987-a392-f1155cc2a20a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.097359] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb22dc18-307b-44ef-9dce-0647a223dde6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.104630] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 963.106171] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-11c2ed0b-88f8-4188-9f2c-ee4045d9b5d5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.107883] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 963.108082] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 963.118034] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9f022219-e4ee-4707-9457-8f642bb40c4e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.129423] env[60548]: DEBUG oslo_vmware.api [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Waiting for the task: (returnval){
[ 963.129423] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5212fb5d-ffe8-a135-ddbb-df6ab2b1eda9"
[ 963.129423] env[60548]: _type = "Task"
[ 963.129423] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 963.132904] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323388, 'name': CreateVM_Task, 'duration_secs': 0.33467} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 963.136657] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 963.137493] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 963.137590] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 963.137979] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 963.138669] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bfd2740c-faa6-4408-b293-5b46449a5507 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.146399] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 963.146647] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Creating directory with path [datastore1] vmware_temp/d3a27b41-b37f-48f2-9d51-07ae56a99eb1/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 963.146969] env[60548]: DEBUG oslo_vmware.api [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Waiting for the task: (returnval){
[ 963.146969] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5233bdf2-cb88-26e2-6c43-cb2d817c247f"
[ 963.146969] env[60548]: _type = "Task"
[ 963.146969] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 963.147197] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-655f43d0-3d0a-4b66-a9bb-c6c7f666f25c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.159528] env[60548]: DEBUG oslo_vmware.api [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5233bdf2-cb88-26e2-6c43-cb2d817c247f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 963.178115] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Created directory with path [datastore1] vmware_temp/d3a27b41-b37f-48f2-9d51-07ae56a99eb1/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 963.178377] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Fetch image to [datastore1] vmware_temp/d3a27b41-b37f-48f2-9d51-07ae56a99eb1/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 963.178543] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/d3a27b41-b37f-48f2-9d51-07ae56a99eb1/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 963.179425] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b721dbb1-5b12-4e33-aa2b-e2571e716fe4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.188660] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bb50c83-6b29-4c85-a819-3e94e6b59a6d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.203726] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c26349b-1fc1-41b7-9a1d-fb6b8cb81159 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.208438] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 963.208700] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 963.209058] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Deleting the datastore file [datastore1] 46737200-2da8-41ee-b33e-3bb6cc3e4618 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 963.209250] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0ab5125a-93ca-4aac-a0b2-8f20a60dc370 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.251587] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eed78ec2-acb2-4fc6-8284-d6f92adf6a74 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.255586] env[60548]: DEBUG oslo_vmware.api [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Waiting for the task: (returnval){
[ 963.255586] env[60548]: value = "task-4323390"
[ 963.255586] env[60548]: _type = "Task"
[ 963.255586] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 963.264558] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5aa14935-1f06-4859-800f-37768fd108f6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.271171] env[60548]: DEBUG oslo_vmware.api [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Task: {'id': task-4323390, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 963.271643] env[60548]: DEBUG nova.policy [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '00dcf60eaad04b91b049aeb30a4f75fc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4ff13b85e9c649c4911ee029ff0304f4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}}
[ 963.274048] env[60548]: DEBUG nova.network.neutron [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Updating instance_info_cache with network_info: [{"id": "59582163-0304-4f82-9a45-39db58047179", "address": "fa:16:3e:e2:38:39", "network": {"id": "3078774a-2da5-4c84-b8f9-2cdd08b17c94", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1829706988-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e5343838b9d64cf0aeb72c368f89eea7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap59582163-03", "ovs_interfaceid": "59582163-0304-4f82-9a45-39db58047179", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 963.289159] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Releasing lock "refresh_cache-ad98988d-92aa-4ace-8e40-cd316758002e" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 963.289647] env[60548]: DEBUG nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Instance network_info: |[{"id": "59582163-0304-4f82-9a45-39db58047179", "address": "fa:16:3e:e2:38:39", "network": {"id": "3078774a-2da5-4c84-b8f9-2cdd08b17c94", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1829706988-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e5343838b9d64cf0aeb72c368f89eea7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap59582163-03", "ovs_interfaceid": "59582163-0304-4f82-9a45-39db58047179", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 963.290303] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e2:38:39', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ad72c645-a67d-4efd-b563-28e44077e68d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '59582163-0304-4f82-9a45-39db58047179', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 963.298762] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Creating folder: Project (e5343838b9d64cf0aeb72c368f89eea7). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 963.302690] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ecd95764-fc65-413f-a0a3-2d1543601b3e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.304844] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 963.317924] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Created folder: Project (e5343838b9d64cf0aeb72c368f89eea7) in parent group-v850287.
[ 963.318165] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Creating folder: Instances. Parent ref: group-v850349. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 963.319025] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8d2d422b-16f4-4530-b26e-e791f0911029 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.328476] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Created folder: Instances in parent group-v850349.
[ 963.328722] env[60548]: DEBUG oslo.service.loopingcall [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 963.328929] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 963.329208] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f0849609-4d32-4ba2-af33-7d4f1997d28e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.350979] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 963.350979] env[60548]: value = "task-4323393"
[ 963.350979] env[60548]: _type = "Task"
[ 963.350979] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 963.360111] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323393, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 963.379536] env[60548]: DEBUG nova.compute.manager [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Received event network-vif-plugged-59582163-0304-4f82-9a45-39db58047179 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 963.379833] env[60548]: DEBUG oslo_concurrency.lockutils [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] Acquiring lock "ad98988d-92aa-4ace-8e40-cd316758002e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 963.380173] env[60548]: DEBUG oslo_concurrency.lockutils [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] Lock "ad98988d-92aa-4ace-8e40-cd316758002e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 963.380587] env[60548]: DEBUG oslo_concurrency.lockutils [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] Lock "ad98988d-92aa-4ace-8e40-cd316758002e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 963.380587] env[60548]: DEBUG nova.compute.manager [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] No waiting events found dispatching network-vif-plugged-59582163-0304-4f82-9a45-39db58047179 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 963.380705] env[60548]: WARNING nova.compute.manager [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Received unexpected event network-vif-plugged-59582163-0304-4f82-9a45-39db58047179 for instance with vm_state building and task_state spawning.
[ 963.380870] env[60548]: DEBUG nova.compute.manager [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Received event network-changed-59582163-0304-4f82-9a45-39db58047179 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 963.381022] env[60548]: DEBUG nova.compute.manager [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Refreshing instance network info cache due to event network-changed-59582163-0304-4f82-9a45-39db58047179. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 963.381168] env[60548]: DEBUG oslo_concurrency.lockutils [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] Acquiring lock "refresh_cache-ad98988d-92aa-4ace-8e40-cd316758002e" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 963.381368] env[60548]: DEBUG oslo_concurrency.lockutils [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] Acquired lock "refresh_cache-ad98988d-92aa-4ace-8e40-cd316758002e" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 963.381481] env[60548]: DEBUG nova.network.neutron [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Refreshing network info cache for port 59582163-0304-4f82-9a45-39db58047179 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 963.459053] env[60548]: DEBUG neutronclient.v2_0.client [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60548) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 963.460666] env[60548]: ERROR nova.compute.manager [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Traceback (most recent call last):
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] result = getattr(controller, method)(*args, **kwargs)
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self._get(image_id)
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return RequestIdProxy(wrapped(*args, **kwargs))
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] resp, body = self.http_client.get(url, headers=header)
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self.request(url, 'GET', **kwargs)
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self._handle_response(resp)
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] raise exc.from_response(resp, resp.content)
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] During handling of the above exception, another exception occurred:
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Traceback (most recent call last):
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self.driver.spawn(context, instance, image_meta,
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self._fetch_image_if_missing(context, vi)
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] image_fetch(context, vi, tmp_image_ds_loc)
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] images.fetch_image(
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] metadata = IMAGE_API.get(context, image_ref)
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return session.show(context, image_id,
[ 963.460666] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] _reraise_translated_image_exception(image_id)
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] raise new_exc.with_traceback(exc_trace)
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] result = getattr(controller, method)(*args, **kwargs)
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self._get(image_id)
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return RequestIdProxy(wrapped(*args, **kwargs))
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] resp, body = self.http_client.get(url, headers=header)
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self.request(url, 'GET', **kwargs)
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self._handle_response(resp)
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] raise exc.from_response(resp, resp.content)
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b.
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] During handling of the above exception, another exception occurred:
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Traceback (most recent call last):
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self._build_and_run_instance(context, instance, image,
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] with excutils.save_and_reraise_exception():
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self.force_reraise()
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] raise self.value
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] with self.rt.instance_claim(context, instance, node, allocs,
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self.abort()
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 963.462335] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return f(*args, **kwargs)
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self._unset_instance_host_and_node(instance)
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] instance.save()
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] updates, result = self.indirection_api.object_action(
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return cctxt.call(context, 'object_action', objinst=objinst,
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] result = self.transport._send(
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self._driver.send(target, ctxt, message,
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] raise result
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] nova.exception_Remote.InstanceNotFound_Remote: Instance 306f3cb9-3028-4ff2-8090-2c9c1c72efc1 could not be found.
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Traceback (most recent call last):
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return getattr(target, method)(*args, **kwargs)
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return fn(self, *args, **kwargs)
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] old_ref, inst_ref = db.instance_update_and_get_original(
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return f(*args, **kwargs)
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] with excutils.save_and_reraise_exception() as ectxt:
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self.force_reraise()
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] raise self.value
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return f(*args, **kwargs)
[ 963.464049] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return f(context, *args, **kwargs)
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] raise exception.InstanceNotFound(instance_id=uuid)
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] nova.exception.InstanceNotFound: Instance 306f3cb9-3028-4ff2-8090-2c9c1c72efc1 could not be found.
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] During handling of the above exception, another exception occurred:
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Traceback (most recent call last):
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] ret = obj(*args, **kwargs)
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] exception_handler_v20(status_code, error_body)
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] raise client_exc(message=error_message,
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Neutron server returns request_ids: ['req-cf9773f8-6be2-43fb-ae75-5604a57e0e8f']
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] During handling of the above exception, another exception occurred:
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1]
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Traceback (most recent call last):
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self._deallocate_network(context, instance, requested_networks)
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self.network_api.deallocate_for_instance(
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] data = neutron.list_ports(**search_opts)
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] ret = obj(*args, **kwargs)
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self.list('ports', self.ports_path, retrieve_all,
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] ret = obj(*args, **kwargs)
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 963.465620] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] for r in self._pagination(collection, path, **params):
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] res = self.get(path, params=params)
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] ret = obj(*args, **kwargs)
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self.retry_request("GET", action, body=body,
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] ret = obj(*args, **kwargs)
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] return self.do_request(method, action, body=body,
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] ret = obj(*args, **kwargs)
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] self._handle_fault_response(status_code, replybody, resp)
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] raise exception.Unauthorized()
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] nova.exception.Unauthorized: Not authorized.
[ 963.467524] env[60548]: ERROR nova.compute.manager [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] [ 963.491069] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ea692a03-b680-4082-a4d3-36b28076ef4d tempest-ListImageFiltersTestJSON-1294019059 tempest-ListImageFiltersTestJSON-1294019059-project-member] Lock "306f3cb9-3028-4ff2-8090-2c9c1c72efc1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 306.749s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 963.506118] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 963.507080] env[60548]: ERROR nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b. [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Traceback (most recent call last): [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] result = getattr(controller, method)(*args, **kwargs) [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self._get(image_id) [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return RequestIdProxy(wrapped(*args, **kwargs)) [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] resp, body = self.http_client.get(url, headers=header) [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self.request(url, 'GET', **kwargs) [ 963.507080] 
env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self._handle_response(resp) [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] raise exc.from_response(resp, resp.content) [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] During handling of the above exception, another exception occurred: [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Traceback (most recent call last): [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] yield resources [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self.driver.spawn(context, instance, image_meta, [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self._fetch_image_if_missing(context, vi) [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] image_fetch(context, vi, tmp_image_ds_loc) [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] images.fetch_image( [ 963.507080] env[60548]: 
ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 963.507080] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] metadata = IMAGE_API.get(context, image_ref) [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return session.show(context, image_id, [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] _reraise_translated_image_exception(image_id) [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] raise new_exc.with_traceback(exc_trace) [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] result = getattr(controller, method)(*args, **kwargs) [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self._get(image_id) [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return RequestIdProxy(wrapped(*args, **kwargs)) [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] resp, body = self.http_client.get(url, headers=header) [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self.request(url, 'GET', **kwargs) [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 
6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self._handle_response(resp) [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] raise exc.from_response(resp, resp.content) [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b. [ 963.508140] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] [ 963.508140] env[60548]: INFO nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Terminating instance [ 963.509497] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 963.509758] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 963.510637] env[60548]: DEBUG nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 963.510835] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 963.511107] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1b9125ea-b8e1-4bd7-9440-7b2f1d1cca3e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 963.514684] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2edc40c1-df94-4a8e-ba2c-ad9a9d328b4c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 963.518601] env[60548]: DEBUG nova.compute.manager [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Starting instance... 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 963.527066] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 963.527355] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2e35d7bc-cc4b-450a-938e-90073a6fe118 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 963.530383] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 963.530542] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 963.531985] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b9c9ad5b-453b-44e9-ba97-b5807052b45c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 963.539184] env[60548]: DEBUG oslo_vmware.api [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Waiting for the task: (returnval){ [ 963.539184] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52f4bbc4-5d94-e1d3-fab6-44795c883442" [ 963.539184] env[60548]: _type = "Task" [ 963.539184] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 963.563632] env[60548]: DEBUG oslo_vmware.api [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52f4bbc4-5d94-e1d3-fab6-44795c883442, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 963.588345] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 963.588607] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 963.590695] env[60548]: INFO nova.compute.claims [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 963.659837] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 963.660134] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 963.660358] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 963.714026] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 963.714482] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 963.714623] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Deleting the datastore file [datastore1] 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2 {{(pid=60548) file_delete 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 963.714897] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9435d392-1141-427a-a44e-7e922e683d1c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 963.723658] env[60548]: DEBUG oslo_vmware.api [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Waiting for the task: (returnval){ [ 963.723658] env[60548]: value = "task-4323395" [ 963.723658] env[60548]: _type = "Task" [ 963.723658] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 963.737085] env[60548]: DEBUG oslo_vmware.api [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Task: {'id': task-4323395, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 963.766994] env[60548]: DEBUG oslo_vmware.api [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Task: {'id': task-4323390, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08738} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 963.767271] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 963.767451] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 963.767623] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 963.768357] env[60548]: INFO nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Took 0.68 seconds to destroy the instance on the hypervisor. 
[ 963.770080] env[60548]: DEBUG nova.compute.claims [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 963.770262] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 963.808101] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e7b8824-a815-42be-8ae1-7660fd200507 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.816364] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9da3a02-fd5b-42fd-8cb2-6dc7bdc4c66c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.848797] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b39a1d42-1e14-4459-811f-0cbb17c555cb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.852583] env[60548]: DEBUG nova.network.neutron [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Updating instance_info_cache with network_info: [{"id": "0fecdaa6-f624-45a1-b74d-c1443e12532d", "address": "fa:16:3e:20:ed:18", "network": {"id": "d5f86510-a301-4081-92ea-e8d3f51eda39", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1500644825-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e712a58321b3496d851d463771623d15", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a9abd00f-2cea-40f8-9804-a56b6431192d", "external-id": "nsx-vlan-transportzone-639", "segmentation_id": 639, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0fecdaa6-f6", "ovs_interfaceid": "0fecdaa6-f624-45a1-b74d-c1443e12532d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 963.864440] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323393, 'name': CreateVM_Task, 'duration_secs': 0.437443} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 963.865605] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45c46139-09ce-48ca-909a-2a209bcc0979 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.869720] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 963.871106] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 963.871175] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 963.871480] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 963.871901] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Releasing lock "refresh_cache-e3fd811a-186d-436f-bdef-a910a3ccd416" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 963.872307] env[60548]: DEBUG nova.compute.manager [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Instance network_info: |[{"id": "0fecdaa6-f624-45a1-b74d-c1443e12532d", "address": "fa:16:3e:20:ed:18", "network": {"id": "d5f86510-a301-4081-92ea-e8d3f51eda39", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1500644825-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e712a58321b3496d851d463771623d15", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a9abd00f-2cea-40f8-9804-a56b6431192d", "external-id": "nsx-vlan-transportzone-639", "segmentation_id": 639, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0fecdaa6-f6", "ovs_interfaceid": "0fecdaa6-f624-45a1-b74d-c1443e12532d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 963.873072] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-739c0106-e559-468e-b0f3-35de8b30467d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.874944] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:20:ed:18', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a9abd00f-2cea-40f8-9804-a56b6431192d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0fecdaa6-f624-45a1-b74d-c1443e12532d', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 963.883125] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Creating folder: Project (e712a58321b3496d851d463771623d15). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 963.893714] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f1e1dbed-f5ac-46db-9abc-26bf7e35a632 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.895716] env[60548]: DEBUG nova.compute.provider_tree [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 963.900624] env[60548]: DEBUG oslo_vmware.api [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Waiting for the task: (returnval){
[ 963.900624] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52601572-1578-3d92-666b-c3df35f311cb"
[ 963.900624] env[60548]: _type = "Task"
[ 963.900624] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 963.907988] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Created folder: Project (e712a58321b3496d851d463771623d15) in parent group-v850287.
[ 963.908250] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Creating folder: Instances. Parent ref: group-v850352. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 963.912473] env[60548]: DEBUG nova.scheduler.client.report [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 963.915411] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-47697fc1-79d9-4f07-ac02-01a5e461f97b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.917550] env[60548]: DEBUG oslo_vmware.api [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52601572-1578-3d92-666b-c3df35f311cb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 963.928672] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Created folder: Instances in parent group-v850352.
[ 963.928944] env[60548]: DEBUG oslo.service.loopingcall [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 963.929276] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 963.929428] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-03361225-a8d6-4800-8879-04715c97cd7b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 963.944803] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.356s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 963.945312] env[60548]: DEBUG nova.compute.manager [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 963.948284] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.178s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 963.954601] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 963.954601] env[60548]: value = "task-4323398"
[ 963.954601] env[60548]: _type = "Task"
[ 963.954601] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 963.962791] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323398, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 963.979497] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.031s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 963.980483] env[60548]: DEBUG nova.compute.utils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Instance 46737200-2da8-41ee-b33e-3bb6cc3e4618 could not be found. {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 963.981802] env[60548]: DEBUG nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Instance disappeared during build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 963.981919] env[60548]: DEBUG nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 963.982092] env[60548]: DEBUG nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 963.982299] env[60548]: DEBUG nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 963.982416] env[60548]: DEBUG nova.network.neutron [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 964.002768] env[60548]: DEBUG nova.compute.utils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 964.005788] env[60548]: DEBUG nova.compute.manager [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 964.006029] env[60548]: DEBUG nova.network.neutron [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 964.022863] env[60548]: DEBUG nova.compute.manager [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 964.032506] env[60548]: DEBUG nova.compute.manager [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Received event network-changed-a70f9e4e-ccf1-4a44-aeea-89644226015c {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 964.032506] env[60548]: DEBUG nova.compute.manager [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Refreshing instance network info cache due to event network-changed-a70f9e4e-ccf1-4a44-aeea-89644226015c. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 964.032506] env[60548]: DEBUG oslo_concurrency.lockutils [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] Acquiring lock "refresh_cache-585e3015-faef-40df-b3dd-04d2c8e4dd00" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 964.032506] env[60548]: DEBUG oslo_concurrency.lockutils [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] Acquired lock "refresh_cache-585e3015-faef-40df-b3dd-04d2c8e4dd00" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 964.032506] env[60548]: DEBUG nova.network.neutron [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Refreshing network info cache for port a70f9e4e-ccf1-4a44-aeea-89644226015c {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 964.055747] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 964.056074] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Creating directory with path [datastore1] vmware_temp/ec2ae176-f15e-432a-a067-d974c29ad327/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 964.056323] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f547c53a-9c05-47ba-b7e3-7bb9c790401b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.074254] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Created directory with path [datastore1] vmware_temp/ec2ae176-f15e-432a-a067-d974c29ad327/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 964.074254] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Fetch image to [datastore1] vmware_temp/ec2ae176-f15e-432a-a067-d974c29ad327/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 964.074438] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/ec2ae176-f15e-432a-a067-d974c29ad327/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 964.075680] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b453cc0-c897-41bf-b293-5b35304a4962 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.088550] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04de607e-d762-4121-8df9-149f2cb1a05c {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.103394] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cca2a978-63c4-42e7-9699-71eff59b01c4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.111033] env[60548]: DEBUG nova.compute.manager [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 964.144047] env[60548]: DEBUG nova.network.neutron [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Successfully created port: c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 964.146971] env[60548]: DEBUG nova.network.neutron [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Successfully created port: 6ac0f8a3-c5bf-4a38-8a33-9cafae07546f {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 964.149770] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af5fee0f-70df-4251-837d-0b5a17bb3927 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.159251] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7831fa9f-04bc-4604-b868-c8803762a076 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.188418] env[60548]: DEBUG nova.policy [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1937efa54ca4481f8260913e983bb134', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4c0d2d521309468ba27acf66751f02f7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}}
[ 964.195689] env[60548]: DEBUG nova.virt.hardware [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 964.196027] env[60548]: DEBUG nova.virt.hardware [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 964.196261] env[60548]: DEBUG nova.virt.hardware [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 964.196645] env[60548]: DEBUG nova.virt.hardware [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 964.196645] env[60548]: DEBUG nova.virt.hardware [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 964.196824] env[60548]: DEBUG nova.virt.hardware [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 964.197067] env[60548]: DEBUG nova.virt.hardware [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 964.197260] env[60548]: DEBUG nova.virt.hardware [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 964.197477] env[60548]: DEBUG nova.virt.hardware [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 964.197654] env[60548]: DEBUG nova.virt.hardware [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 964.197928] env[60548]: DEBUG nova.virt.hardware [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 964.199395] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 964.201893] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6dc12b6-7394-4066-a7a6-cbc212625ca7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.205851] env[60548]: DEBUG neutronclient.v2_0.client [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60548) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 964.207261] env[60548]: ERROR nova.compute.manager [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Traceback (most recent call last):
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] result = getattr(controller, method)(*args, **kwargs)
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self._get(image_id)
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return RequestIdProxy(wrapped(*args, **kwargs))
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] resp, body = self.http_client.get(url, headers=header)
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self.request(url, 'GET', **kwargs)
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self._handle_response(resp)
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] raise exc.from_response(resp, resp.content)
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618]
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] During handling of the above exception, another exception occurred:
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618]
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Traceback (most recent call last):
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self.driver.spawn(context, instance, image_meta,
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self._fetch_image_if_missing(context, vi)
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] image_fetch(context, vi, tmp_image_ds_loc)
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] images.fetch_image(
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] metadata = IMAGE_API.get(context, image_ref)
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return session.show(context, image_id,
[ 964.207261] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] _reraise_translated_image_exception(image_id)
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] raise new_exc.with_traceback(exc_trace)
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] result = getattr(controller, method)(*args, **kwargs)
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self._get(image_id)
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return RequestIdProxy(wrapped(*args, **kwargs))
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] resp, body = self.http_client.get(url, headers=header)
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self.request(url, 'GET', **kwargs)
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self._handle_response(resp)
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] raise exc.from_response(resp, resp.content)
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b.
[ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] During handling of the above exception, another exception occurred: [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Traceback (most recent call last): [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self._build_and_run_instance(context, instance, image, [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] with excutils.save_and_reraise_exception(): [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self.force_reraise() [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] raise self.value [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] with self.rt.instance_claim(context, instance, node, allocs, [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self.abort() [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 964.208339] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return f(*args, **kwargs) [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self._unset_instance_host_and_node(instance) [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 
46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] instance.save() [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] updates, result = self.indirection_api.object_action( [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return cctxt.call(context, 'object_action', objinst=objinst, [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] result = self.transport._send( [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self._driver.send(target, ctxt, message, [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] raise result [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] nova.exception_Remote.InstanceNotFound_Remote: Instance 46737200-2da8-41ee-b33e-3bb6cc3e4618 could not be found. 
[ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Traceback (most recent call last): [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return getattr(target, method)(*args, **kwargs) [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return fn(self, *args, **kwargs) [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] old_ref, inst_ref = db.instance_update_and_get_original( [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return f(*args, **kwargs) [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] with excutils.save_and_reraise_exception() as ectxt: [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self.force_reraise() [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] raise self.value [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 964.209893] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return f(*args, **kwargs) [ 964.209893] 
env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return f(context, *args, **kwargs) [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] raise exception.InstanceNotFound(instance_id=uuid) [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] nova.exception.InstanceNotFound: Instance 46737200-2da8-41ee-b33e-3bb6cc3e4618 could not be found. [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] During handling of the above exception, another exception occurred: [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Traceback (most recent call last): [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] ret = obj(*args, **kwargs) [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] exception_handler_v20(status_code, error_body) [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] raise client_exc(message=error_message, [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 
46737200-2da8-41ee-b33e-3bb6cc3e4618] Neutron server returns request_ids: ['req-b08a05dc-6256-4773-a526-ca16dafc815e'] [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] During handling of the above exception, another exception occurred: [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Traceback (most recent call last): [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self._deallocate_network(context, instance, requested_networks) [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self.network_api.deallocate_for_instance( [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] data = neutron.list_ports(**search_opts) [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] ret = obj(*args, **kwargs) [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self.list('ports', self.ports_path, retrieve_all, [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] ret = obj(*args, **kwargs) [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 964.214438] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] for r in self._pagination(collection, path, **params): [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] res = self.get(path, params=params) [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 
46737200-2da8-41ee-b33e-3bb6cc3e4618] ret = obj(*args, **kwargs) [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self.retry_request("GET", action, body=body, [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] ret = obj(*args, **kwargs) [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] return self.do_request(method, action, body=body, [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] ret = obj(*args, **kwargs) [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] self._handle_fault_response(status_code, replybody, resp) [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] raise exception.Unauthorized() [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] nova.exception.Unauthorized: Not authorized. [ 964.215530] env[60548]: ERROR nova.compute.manager [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] [ 964.219844] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed2c0c17-f44b-4493-88d9-65e80f96430f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 964.263156] env[60548]: DEBUG oslo_vmware.api [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Task: {'id': task-4323395, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079976} completed successfully. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 964.263156] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 964.263289] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 964.263433] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 964.263595] env[60548]: INFO nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Took 0.75 seconds to destroy the instance on the hypervisor. [ 964.266338] env[60548]: DEBUG nova.compute.claims [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 964.266533] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 964.266751] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 964.271259] env[60548]: DEBUG oslo_concurrency.lockutils [None req-0cce2017-52b4-498d-9162-f2d06a7fc921 tempest-DeleteServersTestJSON-455906324 tempest-DeleteServersTestJSON-455906324-project-member] Lock "46737200-2da8-41ee-b33e-3bb6cc3e4618" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 307.011s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 964.289147] env[60548]: DEBUG nova.compute.manager [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Starting instance... 
{{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 964.309604] env[60548]: DEBUG oslo_vmware.rw_handles [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ec2ae176-f15e-432a-a067-d974c29ad327/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 964.372313] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.105s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 964.372480] env[60548]: DEBUG nova.compute.utils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Instance 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2 could not be found. {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 964.376469] env[60548]: DEBUG nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Instance disappeared during build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 964.376579] env[60548]: DEBUG nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 964.377100] env[60548]: DEBUG nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 964.377100] env[60548]: DEBUG nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 964.377100] env[60548]: DEBUG nova.network.neutron [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 964.382262] env[60548]: DEBUG oslo_vmware.rw_handles [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Completed reading data from the image iterator. {{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 964.382262] env[60548]: DEBUG oslo_vmware.rw_handles [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ec2ae176-f15e-432a-a067-d974c29ad327/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 964.413428] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 964.413685] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 964.415413] env[60548]: INFO nova.compute.claims [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 964.419134] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 964.419134] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 964.419134] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 964.468361] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323398, 'name': CreateVM_Task, 'duration_secs': 0.336815} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 964.468756] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 964.469587] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 964.469587] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 964.469786] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 964.470757] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-85d0851e-f0cb-445b-905c-dcda56edc9ab {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 964.482462] env[60548]: DEBUG oslo_vmware.api [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Waiting for the task: (returnval){ [ 964.482462] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5237fab5-a4b5-b127-a6ec-2d06c7c9c361" [ 964.482462] env[60548]: _type = "Task" [ 964.482462] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 964.494500] env[60548]: DEBUG oslo_vmware.api [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]5237fab5-a4b5-b127-a6ec-2d06c7c9c361, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 964.613475] env[60548]: DEBUG neutronclient.v2_0.client [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60548) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 964.615031] env[60548]: ERROR nova.compute.manager [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Traceback (most recent call last): [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] result = getattr(controller, method)(*args, **kwargs) [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self._get(image_id) [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return RequestIdProxy(wrapped(*args, **kwargs)) [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] resp, body = self.http_client.get(url, headers=header) [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self.request(url, 'GET', **kwargs) [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self._handle_response(resp) [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 
964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] raise exc.from_response(resp, resp.content) [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] During handling of the above exception, another exception occurred: [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Traceback (most recent call last): [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self.driver.spawn(context, instance, image_meta, [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self._fetch_image_if_missing(context, vi) [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] image_fetch(context, vi, tmp_image_ds_loc) [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] images.fetch_image( [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] metadata = IMAGE_API.get(context, image_ref) [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return session.show(context, image_id, [ 964.615031] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 964.616194] env[60548]: ERROR nova.compute.manager 
[instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] _reraise_translated_image_exception(image_id) [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] raise new_exc.with_traceback(exc_trace) [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] result = getattr(controller, method)(*args, **kwargs) [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self._get(image_id) [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return RequestIdProxy(wrapped(*args, **kwargs)) [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] resp, body = self.http_client.get(url, headers=header) [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self.request(url, 'GET', **kwargs) [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self._handle_response(resp) [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] raise exc.from_response(resp, resp.content) [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] nova.exception.ImageNotAuthorized: Not authorized for image 5674e50f-0c0c-4f19-8379-104dac34660b. 
[ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] During handling of the above exception, another exception occurred: [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Traceback (most recent call last): [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self._build_and_run_instance(context, instance, image, [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] with excutils.save_and_reraise_exception(): [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self.force_reraise() [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] raise self.value [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] with self.rt.instance_claim(context, instance, node, allocs, [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self.abort() [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 964.616194] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return f(*args, **kwargs) [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self._unset_instance_host_and_node(instance) [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 
6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] instance.save() [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] updates, result = self.indirection_api.object_action( [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return cctxt.call(context, 'object_action', objinst=objinst, [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] result = self.transport._send( [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self._driver.send(target, ctxt, message, [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] raise result [ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] nova.exception_Remote.InstanceNotFound_Remote: Instance 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2 could not be found. 
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Traceback (most recent call last):
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return getattr(target, method)(*args, **kwargs)
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return fn(self, *args, **kwargs)
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] old_ref, inst_ref = db.instance_update_and_get_original(
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return f(*args, **kwargs)
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] with excutils.save_and_reraise_exception() as ectxt:
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self.force_reraise()
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] raise self.value
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return f(*args, **kwargs)
[ 964.617373] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return f(context, *args, **kwargs)
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] raise exception.InstanceNotFound(instance_id=uuid)
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] nova.exception.InstanceNotFound: Instance 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2 could not be found.
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] During handling of the above exception, another exception occurred:
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Traceback (most recent call last):
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] ret = obj(*args, **kwargs)
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] exception_handler_v20(status_code, error_body)
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] raise client_exc(message=error_message,
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Neutron server returns request_ids: ['req-3ab8e88a-a7c9-40f1-a4b0-8cd375e74bb6']
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] During handling of the above exception, another exception occurred:
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Traceback (most recent call last):
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self._deallocate_network(context, instance, requested_networks)
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self.network_api.deallocate_for_instance(
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] data = neutron.list_ports(**search_opts)
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] ret = obj(*args, **kwargs)
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self.list('ports', self.ports_path, retrieve_all,
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] ret = obj(*args, **kwargs)
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 964.618985] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] for r in self._pagination(collection, path, **params):
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] res = self.get(path, params=params)
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] ret = obj(*args, **kwargs)
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self.retry_request("GET", action, body=body,
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] ret = obj(*args, **kwargs)
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] return self.do_request(method, action, body=body,
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] ret = obj(*args, **kwargs)
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] self._handle_fault_response(status_code, replybody, resp)
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] raise exception.Unauthorized()
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] nova.exception.Unauthorized: Not authorized.
[ 964.620283] env[60548]: ERROR nova.compute.manager [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2]
[ 964.647199] env[60548]: DEBUG oslo_concurrency.lockutils [None req-85a6e488-4ff9-4b22-885a-af1c2c6ff65c tempest-ImagesNegativeTestJSON-1588877586 tempest-ImagesNegativeTestJSON-1588877586-project-member] Lock "6774e2f5-99d0-4dc9-9ac0-188b35bd68a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 299.159s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 964.663238] env[60548]: DEBUG nova.compute.manager [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Starting instance... {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 964.681624] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4079ff0e-3fd0-43fe-b728-854f41f492b4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.695275] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e20228f9-0555-4ff8-b486-0a03cc498185 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.732107] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a62e1f5a-10ee-4e29-a7d7-5d8a5b2231ee {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.746026] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d95cae2-0868-4a32-8ecc-982dbba596bb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.759830] env[60548]: DEBUG nova.compute.provider_tree [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 964.761898] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 964.768374] env[60548]: DEBUG nova.scheduler.client.report [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 964.787922] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.374s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 964.788475] env[60548]: DEBUG nova.compute.manager [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 964.791022] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.029s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 964.792503] env[60548]: INFO nova.compute.claims [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 964.820575] env[60548]: DEBUG nova.network.neutron [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Updated VIF entry in instance network info cache for port a70f9e4e-ccf1-4a44-aeea-89644226015c. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 964.821111] env[60548]: DEBUG nova.network.neutron [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Updating instance_info_cache with network_info: [{"id": "a70f9e4e-ccf1-4a44-aeea-89644226015c", "address": "fa:16:3e:7d:04:ee", "network": {"id": "f525a1b6-9f81-4d75-8d90-d86d731ac984", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-765077462-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3c6e9283a81a4cc197709fd070c13c34", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "255460d5-71d4-4bfd-87f1-acc10085db7f", "external-id": "nsx-vlan-transportzone-152", "segmentation_id": 152, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa70f9e4e-cc", "ovs_interfaceid": "a70f9e4e-ccf1-4a44-aeea-89644226015c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 964.825173] env[60548]: DEBUG nova.compute.utils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 964.826551] env[60548]: DEBUG nova.compute.manager [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 964.826709] env[60548]: DEBUG nova.network.neutron [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 964.833368] env[60548]: DEBUG oslo_concurrency.lockutils [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] Releasing lock "refresh_cache-585e3015-faef-40df-b3dd-04d2c8e4dd00" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 964.833603] env[60548]: DEBUG nova.compute.manager [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Received event network-vif-plugged-0fecdaa6-f624-45a1-b74d-c1443e12532d {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 964.833782] env[60548]: DEBUG oslo_concurrency.lockutils [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] Acquiring lock "e3fd811a-186d-436f-bdef-a910a3ccd416-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 964.834431] env[60548]: DEBUG oslo_concurrency.lockutils [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] Lock "e3fd811a-186d-436f-bdef-a910a3ccd416-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 964.834431] env[60548]: DEBUG oslo_concurrency.lockutils [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] Lock "e3fd811a-186d-436f-bdef-a910a3ccd416-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 964.834431] env[60548]: DEBUG nova.compute.manager [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] No waiting events found dispatching network-vif-plugged-0fecdaa6-f624-45a1-b74d-c1443e12532d {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 964.834431] env[60548]: WARNING nova.compute.manager [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Received unexpected event network-vif-plugged-0fecdaa6-f624-45a1-b74d-c1443e12532d for instance with vm_state building and task_state spawning.
[ 964.834647] env[60548]: DEBUG nova.compute.manager [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Received event network-changed-0fecdaa6-f624-45a1-b74d-c1443e12532d {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 964.834731] env[60548]: DEBUG nova.compute.manager [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Refreshing instance network info cache due to event network-changed-0fecdaa6-f624-45a1-b74d-c1443e12532d. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 964.834915] env[60548]: DEBUG oslo_concurrency.lockutils [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] Acquiring lock "refresh_cache-e3fd811a-186d-436f-bdef-a910a3ccd416" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 964.835051] env[60548]: DEBUG oslo_concurrency.lockutils [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] Acquired lock "refresh_cache-e3fd811a-186d-436f-bdef-a910a3ccd416" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 964.835364] env[60548]: DEBUG nova.network.neutron [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Refreshing network info cache for port 0fecdaa6-f624-45a1-b74d-c1443e12532d {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 964.850929] env[60548]: DEBUG nova.compute.manager [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 964.931203] env[60548]: DEBUG nova.compute.manager [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 964.965572] env[60548]: DEBUG nova.virt.hardware [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 964.965964] env[60548]: DEBUG nova.virt.hardware [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 964.966211] env[60548]: DEBUG nova.virt.hardware [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 964.966410] env[60548]: DEBUG nova.virt.hardware [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 964.966562] env[60548]: DEBUG nova.virt.hardware [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 964.966718] env[60548]: DEBUG nova.virt.hardware [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 964.966988] env[60548]: DEBUG nova.virt.hardware [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 964.967271] env[60548]: DEBUG nova.virt.hardware [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 964.967379] env[60548]: DEBUG nova.virt.hardware [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 964.967578] env[60548]: DEBUG nova.virt.hardware [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 964.967757] env[60548]: DEBUG nova.virt.hardware [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 964.969100] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-377001cd-062a-499b-b969-ffc7c079b9e7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.987080] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7e32c4e-259e-4bff-bd93-1979ecf792d4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 964.993032] env[60548]: DEBUG nova.policy [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1ef4b39480314d249536efa5b839e1fe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e843b5ba56114cf99c1ae8c7e8617e73', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}}
[ 965.013056] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 965.013056] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 965.013412] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 965.076398] env[60548]: DEBUG nova.network.neutron [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Updated VIF entry in instance network info cache for port 59582163-0304-4f82-9a45-39db58047179. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 965.076398] env[60548]: DEBUG nova.network.neutron [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Updating instance_info_cache with network_info: [{"id": "59582163-0304-4f82-9a45-39db58047179", "address": "fa:16:3e:e2:38:39", "network": {"id": "3078774a-2da5-4c84-b8f9-2cdd08b17c94", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1829706988-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e5343838b9d64cf0aeb72c368f89eea7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ad72c645-a67d-4efd-b563-28e44077e68d", "external-id": "nsx-vlan-transportzone-201", "segmentation_id": 201, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap59582163-03", "ovs_interfaceid": "59582163-0304-4f82-9a45-39db58047179", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 965.078498] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1da64dc5-8b32-41b0-b85d-f32076b2d930 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.089111] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2d12db8-a2f0-4a25-b06e-a2e0b3f37a8f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.094965] env[60548]: DEBUG oslo_concurrency.lockutils [req-b88e6a5b-b24c-4eff-accc-a208ef7b070e req-689f66ad-ccfb-49d4-8eee-6d500d620f39 service nova] Releasing lock "refresh_cache-ad98988d-92aa-4ace-8e40-cd316758002e" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 965.136384] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16473cd8-86c0-4039-bb08-517f6b013705 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.147707] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b95ffa07-33a4-4f06-930c-492017ba3cdb {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.165489] env[60548]: DEBUG nova.compute.provider_tree [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 965.179112] env[60548]: DEBUG nova.scheduler.client.report [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 965.199908] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.409s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 965.200425] env[60548]: DEBUG nova.compute.manager [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Start building networks asynchronously for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 965.243155] env[60548]: DEBUG nova.compute.utils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Using /dev/sd instead of None {{(pid=60548) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 965.244492] env[60548]: DEBUG nova.compute.manager [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Allocating IP information in the background. {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 965.244675] env[60548]: DEBUG nova.network.neutron [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] allocate_for_instance() {{(pid=60548) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 965.259365] env[60548]: DEBUG nova.compute.manager [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Start building block device mappings for instance. {{(pid=60548) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 965.376323] env[60548]: DEBUG nova.compute.manager [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Start spawning the instance on the hypervisor. {{(pid=60548) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 965.405213] env[60548]: DEBUG nova.virt.hardware [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-07-22T11:07:05Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-07-22T11:06:47Z,direct_url=,disk_format='vmdk',id=5674e50f-0c0c-4f19-8379-104dac34660b,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='8c5f3f2bd0c84c96a1b1dc646afca847',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-07-22T11:06:48Z,virtual_size=,visibility=), allow threads: False {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 965.406065] env[60548]: DEBUG nova.virt.hardware [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Flavor limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 965.406794] env[60548]: DEBUG nova.virt.hardware [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Image limits 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 965.406794] env[60548]: DEBUG nova.virt.hardware [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Flavor pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 965.407008] env[60548]: DEBUG nova.virt.hardware [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Image pref 0:0:0 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 965.407265] env[60548]: DEBUG nova.virt.hardware [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60548) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 965.407614] env[60548]: DEBUG nova.virt.hardware [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 965.408387] env[60548]: DEBUG nova.virt.hardware [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 965.408387] env[60548]: DEBUG nova.virt.hardware [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Got 1 possible topologies {{(pid=60548) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 965.408513] env[60548]: DEBUG nova.virt.hardware [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 965.408591] env[60548]: DEBUG nova.virt.hardware [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60548) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 965.409523] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76da0f0e-1aef-4437-a95a-9fe683b34d4d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.420679] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cbfc8e8-77d4-463d-a2c1-be8d4913f076 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.444716] env[60548]: DEBUG nova.network.neutron [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Successfully updated port: 6ac0f8a3-c5bf-4a38-8a33-9cafae07546f {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 965.454214] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Acquiring lock "refresh_cache-e6466fbb-a225-4bbd-839b-f8c4b24d9860" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 965.454366] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Acquired lock "refresh_cache-e6466fbb-a225-4bbd-839b-f8c4b24d9860" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 965.454524] env[60548]: DEBUG nova.network.neutron [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 965.526711] env[60548]: DEBUG nova.network.neutron [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 965.564111] env[60548]: DEBUG nova.policy [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e5dbf4eaf1b4692b5269da912ef44fc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '42dfcefd33194f5fb1356f7abc1cccad', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60548) authorize /opt/stack/nova/nova/policy.py:203}}
[ 965.632112] env[60548]: DEBUG nova.network.neutron [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Updated VIF entry in instance network info cache for port 0fecdaa6-f624-45a1-b74d-c1443e12532d. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 965.632487] env[60548]: DEBUG nova.network.neutron [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Updating instance_info_cache with network_info: [{"id": "0fecdaa6-f624-45a1-b74d-c1443e12532d", "address": "fa:16:3e:20:ed:18", "network": {"id": "d5f86510-a301-4081-92ea-e8d3f51eda39", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1500644825-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e712a58321b3496d851d463771623d15", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a9abd00f-2cea-40f8-9804-a56b6431192d", "external-id": "nsx-vlan-transportzone-639", "segmentation_id": 639, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0fecdaa6-f6", "ovs_interfaceid": "0fecdaa6-f624-45a1-b74d-c1443e12532d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 965.644601] env[60548]: DEBUG oslo_concurrency.lockutils [req-bc45709f-b739-4d01-a7d7-9978a215b73e req-02e51e2d-6400-4906-8427-8c5755196e5e service nova] Releasing lock "refresh_cache-e3fd811a-186d-436f-bdef-a910a3ccd416" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 965.873953] env[60548]: DEBUG nova.network.neutron [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Updating instance_info_cache with network_info: [{"id": "6ac0f8a3-c5bf-4a38-8a33-9cafae07546f", "address": "fa:16:3e:e3:98:27", "network": {"id": "1987a79c-d925-4b2e-8a09-3cee95154162", "bridge": "br-int", "label": "tempest-ServersTestJSON-1864915958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff13b85e9c649c4911ee029ff0304f4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "418ddd3d-5f64-407e-8e0c-c8b81639bee9", "external-id": "nsx-vlan-transportzone-107", "segmentation_id": 107, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6ac0f8a3-c5", "ovs_interfaceid": "6ac0f8a3-c5bf-4a38-8a33-9cafae07546f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 965.886170] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Releasing lock "refresh_cache-e6466fbb-a225-4bbd-839b-f8c4b24d9860" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 965.886500] env[60548]: DEBUG nova.compute.manager [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Instance network_info: |[{"id": "6ac0f8a3-c5bf-4a38-8a33-9cafae07546f", "address": "fa:16:3e:e3:98:27", "network": {"id": "1987a79c-d925-4b2e-8a09-3cee95154162", "bridge": "br-int", "label": "tempest-ServersTestJSON-1864915958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff13b85e9c649c4911ee029ff0304f4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "418ddd3d-5f64-407e-8e0c-c8b81639bee9", "external-id": "nsx-vlan-transportzone-107", "segmentation_id": 107, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6ac0f8a3-c5", "ovs_interfaceid": "6ac0f8a3-c5bf-4a38-8a33-9cafae07546f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 965.887151] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e3:98:27', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '418ddd3d-5f64-407e-8e0c-c8b81639bee9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6ac0f8a3-c5bf-4a38-8a33-9cafae07546f', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 965.901009] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Creating folder: Project (4ff13b85e9c649c4911ee029ff0304f4). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 965.903806] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f69f3978-755d-4dda-87cb-3963381f135d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.921254] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Created folder: Project (4ff13b85e9c649c4911ee029ff0304f4) in parent group-v850287.
[ 965.921254] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Creating folder: Instances. Parent ref: group-v850355. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 965.921254] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4711b49f-1e2d-4330-8f60-3c06c6b4ce63 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.935026] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Created folder: Instances in parent group-v850355.
[ 965.935026] env[60548]: DEBUG oslo.service.loopingcall [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 965.935026] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 965.935026] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2808f2cc-012b-4802-9873-832087b940ee {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 965.953019] env[60548]: DEBUG nova.network.neutron [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Successfully created port: beafefc3-0ed1-4ee5-9e95-b8befdfc339d {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 965.964021] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 965.964021] env[60548]: value = "task-4323401"
[ 965.964021] env[60548]: _type = "Task"
[ 965.964021] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 965.972428] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323401, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 966.467602] env[60548]: DEBUG nova.network.neutron [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Successfully created port: 8fd59865-b405-4979-a4d6-1dae122ce7c5 {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 966.474474] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323401, 'name': CreateVM_Task, 'duration_secs': 0.32979} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 966.474799] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 966.475936] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 966.476267] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 966.476859] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 966.477278] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7b42dd02-7030-4a87-9b98-f7b73b9969d4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 966.483541] env[60548]: DEBUG oslo_vmware.api [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Waiting for the task: (returnval){
[ 966.483541] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]521d80c6-ded6-2e4f-0bc2-1755c19eb255"
[ 966.483541] env[60548]: _type = "Task"
[ 966.483541] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 966.498042] env[60548]: DEBUG oslo_vmware.api [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]521d80c6-ded6-2e4f-0bc2-1755c19eb255, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 966.569640] env[60548]: DEBUG nova.network.neutron [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Successfully updated port: 1c9d862d-8a0d-4c91-834b-72d52fd1fcb7 {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 966.583565] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Acquiring lock "refresh_cache-979c5fe5-051f-4a43-be2f-571aad25a4ae" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 966.583565] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Acquired lock "refresh_cache-979c5fe5-051f-4a43-be2f-571aad25a4ae" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 966.583565] env[60548]: DEBUG nova.network.neutron [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 966.750230] env[60548]: DEBUG nova.network.neutron [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 966.995121] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 966.995121] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 966.995121] env[60548]: DEBUG oslo_concurrency.lockutils [None req-a26ed563-f496-4f2b-9f25-4d1be3d67373 tempest-ServersTestJSON-844148859 tempest-ServersTestJSON-844148859-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 967.158654] env[60548]: DEBUG nova.network.neutron [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Successfully updated port: beafefc3-0ed1-4ee5-9e95-b8befdfc339d {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 967.174891] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Acquiring lock "refresh_cache-d0d515a4-15ce-4276-b151-34a8a556a1df" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 967.174891] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Acquired lock "refresh_cache-d0d515a4-15ce-4276-b151-34a8a556a1df" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 967.174891] env[60548]: DEBUG nova.network.neutron [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 967.225710] env[60548]: DEBUG nova.network.neutron [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 967.423297] env[60548]: DEBUG nova.network.neutron [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Updating instance_info_cache with network_info: [{"id": "beafefc3-0ed1-4ee5-9e95-b8befdfc339d", "address": "fa:16:3e:c0:b6:0d", "network": {"id": "016d838c-8bd6-4390-bfdb-e9590ff7afa1", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-47026491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e843b5ba56114cf99c1ae8c7e8617e73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c1195acd-707f-4bac-a99d-14db17a63802", "external-id": "nsx-vlan-transportzone-322", "segmentation_id": 322, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbeafefc3-0e", "ovs_interfaceid": "beafefc3-0ed1-4ee5-9e95-b8befdfc339d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 967.433717] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Releasing lock "refresh_cache-d0d515a4-15ce-4276-b151-34a8a556a1df" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 967.434343] env[60548]: DEBUG nova.compute.manager [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Instance network_info: |[{"id": "beafefc3-0ed1-4ee5-9e95-b8befdfc339d", "address": "fa:16:3e:c0:b6:0d", "network": {"id": "016d838c-8bd6-4390-bfdb-e9590ff7afa1", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-47026491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e843b5ba56114cf99c1ae8c7e8617e73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c1195acd-707f-4bac-a99d-14db17a63802", "external-id": "nsx-vlan-transportzone-322", "segmentation_id": 322, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbeafefc3-0e", "ovs_interfaceid": "beafefc3-0ed1-4ee5-9e95-b8befdfc339d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async
/opt/stack/nova/nova/compute/manager.py:1967}} [ 967.435278] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c0:b6:0d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c1195acd-707f-4bac-a99d-14db17a63802', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'beafefc3-0ed1-4ee5-9e95-b8befdfc339d', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 967.444381] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Creating folder: Project (e843b5ba56114cf99c1ae8c7e8617e73). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 967.445324] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-312a8703-f2b8-4605-852b-2ab1669536fc {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 967.458303] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Created folder: Project (e843b5ba56114cf99c1ae8c7e8617e73) in parent group-v850287. [ 967.458846] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Creating folder: Instances. Parent ref: group-v850358. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 967.459878] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1599891c-b4f3-4031-9498-f4c1f335c765 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 967.472136] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Created folder: Instances in parent group-v850358. [ 967.472136] env[60548]: DEBUG oslo.service.loopingcall [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 967.472136] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 967.472136] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-72387e23-f05a-4449-8c58-4733f0de331b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 967.492511] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 967.492511] env[60548]: value = "task-4323404" [ 967.492511] env[60548]: _type = "Task" [ 967.492511] env[60548]: } to complete. 
{{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 967.501613] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323404, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 967.740059] env[60548]: DEBUG nova.network.neutron [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Successfully created port: a00a8e60-5bdf-44fc-bc48-27c65f02a00c {{(pid=60548) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 967.838402] env[60548]: DEBUG nova.network.neutron [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Updating instance_info_cache with network_info: [{"id": "1c9d862d-8a0d-4c91-834b-72d52fd1fcb7", "address": "fa:16:3e:df:18:4a", "network": {"id": "dd10a5ee-4ec7-4c98-be7b-6cb397c902f1", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1884105123-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b69a6d4834e24a3cbc4019ee66c0d841", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "23fc30ea-1f06-424d-86e1-27ae5435b1a9", "external-id": "nsx-vlan-transportzone-189", "segmentation_id": 189, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c9d862d-8a", "ovs_interfaceid": "1c9d862d-8a0d-4c91-834b-72d52fd1fcb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 967.851191] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Releasing lock "refresh_cache-979c5fe5-051f-4a43-be2f-571aad25a4ae" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 967.851191] env[60548]: DEBUG nova.compute.manager [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Instance network_info: |[{"id": "1c9d862d-8a0d-4c91-834b-72d52fd1fcb7", "address": "fa:16:3e:df:18:4a", "network": {"id": "dd10a5ee-4ec7-4c98-be7b-6cb397c902f1", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1884105123-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": 
false, "tenant_id": "b69a6d4834e24a3cbc4019ee66c0d841", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "23fc30ea-1f06-424d-86e1-27ae5435b1a9", "external-id": "nsx-vlan-transportzone-189", "segmentation_id": 189, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c9d862d-8a", "ovs_interfaceid": "1c9d862d-8a0d-4c91-834b-72d52fd1fcb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 967.851191] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:df:18:4a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '23fc30ea-1f06-424d-86e1-27ae5435b1a9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1c9d862d-8a0d-4c91-834b-72d52fd1fcb7', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 967.861208] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Creating folder: Project (b69a6d4834e24a3cbc4019ee66c0d841). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 967.862089] env[60548]: DEBUG nova.network.neutron [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Successfully updated port: c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 967.866232] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1a29090a-0208-4086-99a9-95ad9b561be6 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 967.870880] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquiring lock "refresh_cache-8f447658-c66d-4d94-af30-fd43c83dae0e" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 967.871338] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquired lock "refresh_cache-8f447658-c66d-4d94-af30-fd43c83dae0e" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 967.871657] env[60548]: DEBUG nova.network.neutron [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 967.881472] env[60548]: INFO nova.virt.vmwareapi.vm_util 
[None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Created folder: Project (b69a6d4834e24a3cbc4019ee66c0d841) in parent group-v850287. [ 967.881472] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Creating folder: Instances. Parent ref: group-v850361. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 967.881472] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-22799399-39c6-427b-bc83-ee8c37609efd {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 967.899831] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Created folder: Instances in parent group-v850361. [ 967.899831] env[60548]: DEBUG oslo.service.loopingcall [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 967.899831] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 967.899831] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-546c39aa-89ce-4661-bf33-82e5ecd19a27 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 967.925021] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 967.925021] env[60548]: value = "task-4323407" [ 967.925021] env[60548]: _type = "Task" [ 967.925021] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 967.932809] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323407, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 967.984798] env[60548]: DEBUG nova.network.neutron [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 968.007222] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323404, 'name': CreateVM_Task, 'duration_secs': 0.305052} completed successfully. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 968.008533] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 968.010006] env[60548]: DEBUG nova.compute.manager [req-90340ddf-7f2e-4da3-9d10-9b3b66c11446 req-c11a8a2e-0413-4f19-8ed1-2132c47297b0 service nova] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Received event network-vif-plugged-6ac0f8a3-c5bf-4a38-8a33-9cafae07546f {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 968.013541] env[60548]: DEBUG oslo_concurrency.lockutils [req-90340ddf-7f2e-4da3-9d10-9b3b66c11446 req-c11a8a2e-0413-4f19-8ed1-2132c47297b0 service nova] Acquiring lock "e6466fbb-a225-4bbd-839b-f8c4b24d9860-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 968.013771] env[60548]: DEBUG oslo_concurrency.lockutils [req-90340ddf-7f2e-4da3-9d10-9b3b66c11446 req-c11a8a2e-0413-4f19-8ed1-2132c47297b0 service nova] Lock "e6466fbb-a225-4bbd-839b-f8c4b24d9860-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 968.013939] env[60548]: DEBUG oslo_concurrency.lockutils [req-90340ddf-7f2e-4da3-9d10-9b3b66c11446 req-c11a8a2e-0413-4f19-8ed1-2132c47297b0 service nova] Lock "e6466fbb-a225-4bbd-839b-f8c4b24d9860-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 968.014121] env[60548]: DEBUG nova.compute.manager [req-90340ddf-7f2e-4da3-9d10-9b3b66c11446 req-c11a8a2e-0413-4f19-8ed1-2132c47297b0 service nova] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] No waiting events found dispatching network-vif-plugged-6ac0f8a3-c5bf-4a38-8a33-9cafae07546f {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 968.014290] env[60548]: WARNING nova.compute.manager [req-90340ddf-7f2e-4da3-9d10-9b3b66c11446 req-c11a8a2e-0413-4f19-8ed1-2132c47297b0 service nova] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Received unexpected event network-vif-plugged-6ac0f8a3-c5bf-4a38-8a33-9cafae07546f for instance with vm_state building and task_state spawning. 
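The repeated 'Acquiring lock …' / 'Lock … acquired by … :: waited 0.000s' / 'Lock … "released" by … :: held 0.000s' triples around the instance-event handling above are emitted by oslo.concurrency's lockutils, not by nova itself. The sketch below shows the two lockutils forms visible in this log; it is illustrative only, and the lock names are placeholders rather than values taken from the log.

from oslo_concurrency import lockutils

# Decorator form (logged from lockutils' inner()): produces the
# 'acquired by "..." :: waited 0.000s' / '"released" by "..." :: held'
# pairs seen around pop_instance_event above.
@lockutils.synchronized('instance-uuid-events')
def pop_instance_event():
    return None

# Context-manager form (logged from lockutils.lock()): produces the plain
# 'Acquiring lock' / 'Acquired lock' / 'Releasing lock' lines; passing
# external=True adds a file-based lock on top of the process-local
# semaphore, which accounts for the 'Acquired external semaphore' entries.
with lockutils.lock('refresh_cache-instance-uuid'):
    pass

pop_instance_event()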
[ 968.015014] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 968.015174] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 968.015500] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 968.016279] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6048090b-c443-4764-aedb-ee476427d51a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 968.024512] env[60548]: DEBUG oslo_vmware.api [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Waiting for the task: (returnval){ [ 968.024512] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52835e4d-56bb-a968-551f-e67ef42730b2" [ 968.024512] env[60548]: _type = "Task" [ 968.024512] env[60548]: } to complete. 
{{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 968.025782] env[60548]: DEBUG nova.compute.manager [req-dcc14c41-7edc-41b8-b4c3-bee540c435b5 req-7a8b8eb8-ef90-4150-a3d1-9cb9d79e66f6 service nova] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Received event network-vif-plugged-1c9d862d-8a0d-4c91-834b-72d52fd1fcb7 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 968.025978] env[60548]: DEBUG oslo_concurrency.lockutils [req-dcc14c41-7edc-41b8-b4c3-bee540c435b5 req-7a8b8eb8-ef90-4150-a3d1-9cb9d79e66f6 service nova] Acquiring lock "979c5fe5-051f-4a43-be2f-571aad25a4ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 968.026223] env[60548]: DEBUG oslo_concurrency.lockutils [req-dcc14c41-7edc-41b8-b4c3-bee540c435b5 req-7a8b8eb8-ef90-4150-a3d1-9cb9d79e66f6 service nova] Lock "979c5fe5-051f-4a43-be2f-571aad25a4ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 968.026398] env[60548]: DEBUG oslo_concurrency.lockutils [req-dcc14c41-7edc-41b8-b4c3-bee540c435b5 req-7a8b8eb8-ef90-4150-a3d1-9cb9d79e66f6 service nova] Lock "979c5fe5-051f-4a43-be2f-571aad25a4ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 968.026567] env[60548]: DEBUG nova.compute.manager [req-dcc14c41-7edc-41b8-b4c3-bee540c435b5 req-7a8b8eb8-ef90-4150-a3d1-9cb9d79e66f6 service nova] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] No waiting events found dispatching network-vif-plugged-1c9d862d-8a0d-4c91-834b-72d52fd1fcb7 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 968.026760] env[60548]: WARNING nova.compute.manager [req-dcc14c41-7edc-41b8-b4c3-bee540c435b5 req-7a8b8eb8-ef90-4150-a3d1-9cb9d79e66f6 service nova] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Received unexpected event network-vif-plugged-1c9d862d-8a0d-4c91-834b-72d52fd1fcb7 for instance with vm_state building and task_state spawning. [ 968.041159] env[60548]: DEBUG oslo_vmware.api [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52835e4d-56bb-a968-551f-e67ef42730b2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 968.045486] env[60548]: DEBUG nova.compute.manager [req-291f3b2c-b200-40b5-be82-9d311900738c req-92228833-ab5e-4293-a3a3-2e6f1ec6ee8b service nova] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Received event network-changed-6ac0f8a3-c5bf-4a38-8a33-9cafae07546f {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 968.045678] env[60548]: DEBUG nova.compute.manager [req-291f3b2c-b200-40b5-be82-9d311900738c req-92228833-ab5e-4293-a3a3-2e6f1ec6ee8b service nova] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Refreshing instance network info cache due to event network-changed-6ac0f8a3-c5bf-4a38-8a33-9cafae07546f. 
{{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 968.045911] env[60548]: DEBUG oslo_concurrency.lockutils [req-291f3b2c-b200-40b5-be82-9d311900738c req-92228833-ab5e-4293-a3a3-2e6f1ec6ee8b service nova] Acquiring lock "refresh_cache-e6466fbb-a225-4bbd-839b-f8c4b24d9860" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 968.046081] env[60548]: DEBUG oslo_concurrency.lockutils [req-291f3b2c-b200-40b5-be82-9d311900738c req-92228833-ab5e-4293-a3a3-2e6f1ec6ee8b service nova] Acquired lock "refresh_cache-e6466fbb-a225-4bbd-839b-f8c4b24d9860" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 968.046246] env[60548]: DEBUG nova.network.neutron [req-291f3b2c-b200-40b5-be82-9d311900738c req-92228833-ab5e-4293-a3a3-2e6f1ec6ee8b service nova] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Refreshing network info cache for port 6ac0f8a3-c5bf-4a38-8a33-9cafae07546f {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 968.433642] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323407, 'name': CreateVM_Task, 'duration_secs': 0.30836} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 968.433821] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 968.434610] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 968.538404] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 968.538686] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 968.538941] env[60548]: DEBUG oslo_concurrency.lockutils [None req-f9eca8e6-2e63-4f31-994e-5c4047c0e98b tempest-AttachInterfacesTestJSON-943622355 tempest-AttachInterfacesTestJSON-943622355-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 968.539154] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 968.539464] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 968.539723] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-408d368c-2a39-400e-b890-4b676451c40d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 968.545647] env[60548]: DEBUG oslo_vmware.api [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Waiting for the task: (returnval){ [ 968.545647] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52d59700-7b18-9902-4cb1-98850d29afc0" [ 968.545647] env[60548]: _type = "Task" [ 968.545647] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 968.557018] env[60548]: DEBUG oslo_vmware.api [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52d59700-7b18-9902-4cb1-98850d29afc0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 968.751835] env[60548]: DEBUG nova.network.neutron [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Updating instance_info_cache with network_info: [{"id": "c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b", "address": "fa:16:3e:aa:59:d3", "network": {"id": "d5f86510-a301-4081-92ea-e8d3f51eda39", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1500644825-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e712a58321b3496d851d463771623d15", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a9abd00f-2cea-40f8-9804-a56b6431192d", "external-id": "nsx-vlan-transportzone-639", "segmentation_id": 639, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc95ecd04-2d", "ovs_interfaceid": "c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 968.767934] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Releasing lock 
"refresh_cache-8f447658-c66d-4d94-af30-fd43c83dae0e" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 968.768302] env[60548]: DEBUG nova.compute.manager [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Instance network_info: |[{"id": "c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b", "address": "fa:16:3e:aa:59:d3", "network": {"id": "d5f86510-a301-4081-92ea-e8d3f51eda39", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1500644825-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e712a58321b3496d851d463771623d15", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a9abd00f-2cea-40f8-9804-a56b6431192d", "external-id": "nsx-vlan-transportzone-639", "segmentation_id": 639, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc95ecd04-2d", "ovs_interfaceid": "c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 968.768936] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:aa:59:d3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a9abd00f-2cea-40f8-9804-a56b6431192d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 968.777301] env[60548]: DEBUG oslo.service.loopingcall [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 968.777843] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 968.778096] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-553ba871-49d4-4462-a2a9-f3504dad8e9e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 968.800676] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 968.800676] env[60548]: value = "task-4323408" [ 968.800676] env[60548]: _type = "Task" [ 968.800676] env[60548]: } to complete. 
{{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 968.809927] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323408, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 969.060768] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 969.060768] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 969.060768] env[60548]: DEBUG oslo_concurrency.lockutils [None req-26693311-fafe-4637-808f-85f1e9ea6a77 tempest-ImagesOneServerNegativeTestJSON-1667424217 tempest-ImagesOneServerNegativeTestJSON-1667424217-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 969.311839] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323408, 'name': CreateVM_Task, 'duration_secs': 0.309494} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 969.312162] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 969.312915] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 969.313229] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 969.313620] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 969.315017] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-056fa14e-b90e-4e75-b67c-2ccb26be710d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 969.319827] env[60548]: DEBUG oslo_vmware.api [None 
req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Waiting for the task: (returnval){ [ 969.319827] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52676dc8-bb85-c1a3-1b68-7b4e62651bbe" [ 969.319827] env[60548]: _type = "Task" [ 969.319827] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 969.329066] env[60548]: DEBUG oslo_vmware.api [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52676dc8-bb85-c1a3-1b68-7b4e62651bbe, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 969.379178] env[60548]: DEBUG nova.network.neutron [req-291f3b2c-b200-40b5-be82-9d311900738c req-92228833-ab5e-4293-a3a3-2e6f1ec6ee8b service nova] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Updated VIF entry in instance network info cache for port 6ac0f8a3-c5bf-4a38-8a33-9cafae07546f. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 969.379333] env[60548]: DEBUG nova.network.neutron [req-291f3b2c-b200-40b5-be82-9d311900738c req-92228833-ab5e-4293-a3a3-2e6f1ec6ee8b service nova] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Updating instance_info_cache with network_info: [{"id": "6ac0f8a3-c5bf-4a38-8a33-9cafae07546f", "address": "fa:16:3e:e3:98:27", "network": {"id": "1987a79c-d925-4b2e-8a09-3cee95154162", "bridge": "br-int", "label": "tempest-ServersTestJSON-1864915958-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4ff13b85e9c649c4911ee029ff0304f4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "418ddd3d-5f64-407e-8e0c-c8b81639bee9", "external-id": "nsx-vlan-transportzone-107", "segmentation_id": 107, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6ac0f8a3-c5", "ovs_interfaceid": "6ac0f8a3-c5bf-4a38-8a33-9cafae07546f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 969.391810] env[60548]: DEBUG oslo_concurrency.lockutils [req-291f3b2c-b200-40b5-be82-9d311900738c req-92228833-ab5e-4293-a3a3-2e6f1ec6ee8b service nova] Releasing lock "refresh_cache-e6466fbb-a225-4bbd-839b-f8c4b24d9860" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 969.681433] env[60548]: DEBUG nova.network.neutron [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Successfully updated port: 8fd59865-b405-4979-a4d6-1dae122ce7c5 {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 969.703172] env[60548]: DEBUG oslo_concurrency.lockutils [None 
req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Acquiring lock "refresh_cache-3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 969.703172] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Acquired lock "refresh_cache-3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 969.703172] env[60548]: DEBUG nova.network.neutron [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 969.792115] env[60548]: DEBUG nova.network.neutron [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 969.834111] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 969.834766] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 969.835176] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 970.195538] env[60548]: DEBUG nova.compute.manager [req-8609974f-1934-41d8-9740-10e28d6971c0 req-d8c2cdf0-a31d-40cd-9ef1-7e0b431de2f9 service nova] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Received event network-vif-plugged-beafefc3-0ed1-4ee5-9e95-b8befdfc339d {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 970.195538] env[60548]: DEBUG oslo_concurrency.lockutils [req-8609974f-1934-41d8-9740-10e28d6971c0 req-d8c2cdf0-a31d-40cd-9ef1-7e0b431de2f9 service nova] Acquiring lock "d0d515a4-15ce-4276-b151-34a8a556a1df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 970.195538] env[60548]: DEBUG oslo_concurrency.lockutils [req-8609974f-1934-41d8-9740-10e28d6971c0 req-d8c2cdf0-a31d-40cd-9ef1-7e0b431de2f9 service nova] Lock 
"d0d515a4-15ce-4276-b151-34a8a556a1df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 970.195538] env[60548]: DEBUG oslo_concurrency.lockutils [req-8609974f-1934-41d8-9740-10e28d6971c0 req-d8c2cdf0-a31d-40cd-9ef1-7e0b431de2f9 service nova] Lock "d0d515a4-15ce-4276-b151-34a8a556a1df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 970.195538] env[60548]: DEBUG nova.compute.manager [req-8609974f-1934-41d8-9740-10e28d6971c0 req-d8c2cdf0-a31d-40cd-9ef1-7e0b431de2f9 service nova] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] No waiting events found dispatching network-vif-plugged-beafefc3-0ed1-4ee5-9e95-b8befdfc339d {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 970.195538] env[60548]: WARNING nova.compute.manager [req-8609974f-1934-41d8-9740-10e28d6971c0 req-d8c2cdf0-a31d-40cd-9ef1-7e0b431de2f9 service nova] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Received unexpected event network-vif-plugged-beafefc3-0ed1-4ee5-9e95-b8befdfc339d for instance with vm_state building and task_state spawning. [ 970.214941] env[60548]: DEBUG nova.network.neutron [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Successfully updated port: a00a8e60-5bdf-44fc-bc48-27c65f02a00c {{(pid=60548) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 970.222144] env[60548]: DEBUG nova.compute.manager [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Received event network-changed-1c9d862d-8a0d-4c91-834b-72d52fd1fcb7 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 970.222144] env[60548]: DEBUG nova.compute.manager [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Refreshing instance network info cache due to event network-changed-1c9d862d-8a0d-4c91-834b-72d52fd1fcb7. 
{{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 970.222144] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Acquiring lock "refresh_cache-979c5fe5-051f-4a43-be2f-571aad25a4ae" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 970.222144] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Acquired lock "refresh_cache-979c5fe5-051f-4a43-be2f-571aad25a4ae" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 970.223568] env[60548]: DEBUG nova.network.neutron [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Refreshing network info cache for port 1c9d862d-8a0d-4c91-834b-72d52fd1fcb7 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 970.230028] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Acquiring lock "refresh_cache-3a4668ee-e420-4ad8-b638-95b3d55e00c1" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 970.230230] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Acquired lock "refresh_cache-3a4668ee-e420-4ad8-b638-95b3d55e00c1" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 970.230396] env[60548]: DEBUG nova.network.neutron [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Building network info cache for instance {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 970.244634] env[60548]: DEBUG nova.compute.manager [req-b4c50ab4-ca4e-4efd-a9b3-6ae14eec8a8b req-802b8c9e-6e17-41e4-82fb-14ff021ac028 service nova] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Received event network-changed-beafefc3-0ed1-4ee5-9e95-b8befdfc339d {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 970.244830] env[60548]: DEBUG nova.compute.manager [req-b4c50ab4-ca4e-4efd-a9b3-6ae14eec8a8b req-802b8c9e-6e17-41e4-82fb-14ff021ac028 service nova] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Refreshing instance network info cache due to event network-changed-beafefc3-0ed1-4ee5-9e95-b8befdfc339d. 
{{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 970.245139] env[60548]: DEBUG oslo_concurrency.lockutils [req-b4c50ab4-ca4e-4efd-a9b3-6ae14eec8a8b req-802b8c9e-6e17-41e4-82fb-14ff021ac028 service nova] Acquiring lock "refresh_cache-d0d515a4-15ce-4276-b151-34a8a556a1df" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 970.245284] env[60548]: DEBUG oslo_concurrency.lockutils [req-b4c50ab4-ca4e-4efd-a9b3-6ae14eec8a8b req-802b8c9e-6e17-41e4-82fb-14ff021ac028 service nova] Acquired lock "refresh_cache-d0d515a4-15ce-4276-b151-34a8a556a1df" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 970.245445] env[60548]: DEBUG nova.network.neutron [req-b4c50ab4-ca4e-4efd-a9b3-6ae14eec8a8b req-802b8c9e-6e17-41e4-82fb-14ff021ac028 service nova] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Refreshing network info cache for port beafefc3-0ed1-4ee5-9e95-b8befdfc339d {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 970.286918] env[60548]: DEBUG nova.network.neutron [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Updating instance_info_cache with network_info: [{"id": "8fd59865-b405-4979-a4d6-1dae122ce7c5", "address": "fa:16:3e:c7:e9:85", "network": {"id": "a0196fe2-ed4b-4afd-a480-0086ee6ae168", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-794474085-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c0d2d521309468ba27acf66751f02f7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92e4d027-e755-417b-8eea-9a8f24b85140", "external-id": "nsx-vlan-transportzone-756", "segmentation_id": 756, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8fd59865-b4", "ovs_interfaceid": "8fd59865-b405-4979-a4d6-1dae122ce7c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 970.297174] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Releasing lock "refresh_cache-3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 970.297754] env[60548]: DEBUG nova.compute.manager [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Instance network_info: |[{"id": "8fd59865-b405-4979-a4d6-1dae122ce7c5", "address": "fa:16:3e:c7:e9:85", "network": {"id": "a0196fe2-ed4b-4afd-a480-0086ee6ae168", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-794474085-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], 
"gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c0d2d521309468ba27acf66751f02f7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92e4d027-e755-417b-8eea-9a8f24b85140", "external-id": "nsx-vlan-transportzone-756", "segmentation_id": 756, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8fd59865-b4", "ovs_interfaceid": "8fd59865-b405-4979-a4d6-1dae122ce7c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 970.298337] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c7:e9:85', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '92e4d027-e755-417b-8eea-9a8f24b85140', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8fd59865-b405-4979-a4d6-1dae122ce7c5', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 970.305957] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Creating folder: Project (4c0d2d521309468ba27acf66751f02f7). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 970.306414] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9c2805b7-7fe7-4e76-a06a-197e91c3b596 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 970.318289] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Created folder: Project (4c0d2d521309468ba27acf66751f02f7) in parent group-v850287. [ 970.318653] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Creating folder: Instances. Parent ref: group-v850365. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 970.318943] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-633dcb2f-14cb-4546-a778-0ba5d1ba4cff {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 970.329290] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Created folder: Instances in parent group-v850365. 
[ 970.329527] env[60548]: DEBUG oslo.service.loopingcall [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 970.329711] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 970.329908] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-639e46ac-88a4-48cc-a45d-849f98951fe7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 970.349820] env[60548]: DEBUG nova.network.neutron [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Instance cache missing network info. {{(pid=60548) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 970.356761] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 970.356761] env[60548]: value = "task-4323411" [ 970.356761] env[60548]: _type = "Task" [ 970.356761] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 970.366618] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323411, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 970.868347] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323411, 'name': CreateVM_Task, 'duration_secs': 0.297024} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
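The CreateVM_Task records just above follow oslo_vmware.api.wait_for_task's poll-until-done pattern: the task handle is logged, then polled ("progress is 0%") until it reports a terminal state ("completed successfully" with a duration). A minimal sketch of that control flow, assuming a hypothetical fetch_task_info() callable in place of the real vim property read (the real driver drives the poll from oslo.service.loopingcall, as the "Waiting for function ... to return" record shows):

    # Sketch of the poll-until-done loop behind wait_for_task; fetch_task_info
    # is a hypothetical stand-in returning e.g. {'state': 'running', 'progress': 0}.
    import time

    class TaskFailed(Exception):
        pass

    def wait_for_task(fetch_task_info, poll_interval=0.5):
        """Poll a vCenter task until it reaches a terminal state."""
        while True:
            info = fetch_task_info()
            if info['state'] == 'success':
                return info.get('result')            # "completed successfully"
            if info['state'] == 'error':
                raise TaskFailed(info.get('error'))  # surfaced as a task failure
            time.sleep(poll_interval)                # 'queued'/'running': poll again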
[ 970.868742] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 970.869326] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 970.869490] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 970.869798] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 970.870059] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8321b4a4-789d-4463-ad27-cd047370dd33 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 970.874976] env[60548]: DEBUG oslo_vmware.api [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Waiting for the task: (returnval){ [ 970.874976] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52aef1d6-e6a0-574c-76ce-26bbdce9e4a7" [ 970.874976] env[60548]: _type = "Task" [ 970.874976] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 970.885472] env[60548]: DEBUG oslo_vmware.api [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52aef1d6-e6a0-574c-76ce-26bbdce9e4a7, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 971.116428] env[60548]: DEBUG nova.network.neutron [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Updating instance_info_cache with network_info: [{"id": "a00a8e60-5bdf-44fc-bc48-27c65f02a00c", "address": "fa:16:3e:99:ea:b7", "network": {"id": "542e1d67-cf01-44bc-93b3-0e0ae5507f08", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-273292017-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "42dfcefd33194f5fb1356f7abc1cccad", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd96b39f-bd2e-48d1-85c3-577cf97f08c8", "external-id": "cl2-zone-84", "segmentation_id": 84, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa00a8e60-5b", "ovs_interfaceid": "a00a8e60-5bdf-44fc-bc48-27c65f02a00c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 971.127374] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Releasing lock "refresh_cache-3a4668ee-e420-4ad8-b638-95b3d55e00c1" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 971.127683] env[60548]: DEBUG nova.compute.manager [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Instance network_info: |[{"id": "a00a8e60-5bdf-44fc-bc48-27c65f02a00c", "address": "fa:16:3e:99:ea:b7", "network": {"id": "542e1d67-cf01-44bc-93b3-0e0ae5507f08", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-273292017-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "42dfcefd33194f5fb1356f7abc1cccad", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd96b39f-bd2e-48d1-85c3-577cf97f08c8", "external-id": "cl2-zone-84", "segmentation_id": 84, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa00a8e60-5b", "ovs_interfaceid": "a00a8e60-5bdf-44fc-bc48-27c65f02a00c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60548) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 971.128148] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None 
req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:99:ea:b7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'dd96b39f-bd2e-48d1-85c3-577cf97f08c8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a00a8e60-5bdf-44fc-bc48-27c65f02a00c', 'vif_model': 'vmxnet3'}] {{(pid=60548) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 971.137419] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Creating folder: Project (42dfcefd33194f5fb1356f7abc1cccad). Parent ref: group-v850287. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 971.138091] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-663918d7-5ab8-4906-8530-4ebd89a558d4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 971.149864] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Created folder: Project (42dfcefd33194f5fb1356f7abc1cccad) in parent group-v850287. [ 971.150114] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Creating folder: Instances. Parent ref: group-v850368. {{(pid=60548) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 971.150422] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6012ab46-7856-4a41-acb3-949406272b7d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 971.163020] env[60548]: INFO nova.virt.vmwareapi.vm_util [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Created folder: Instances in parent group-v850368. [ 971.163274] env[60548]: DEBUG oslo.service.loopingcall [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60548) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 971.163467] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Creating VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 971.163693] env[60548]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c43dd31c-1875-4672-a361-d2ccfd3b930b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 971.187214] env[60548]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 971.187214] env[60548]: value = "task-4323414" [ 971.187214] env[60548]: _type = "Task" [ 971.187214] env[60548]: } to complete. 
{{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 971.197549] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323414, 'name': CreateVM_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 971.314370] env[60548]: DEBUG nova.network.neutron [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Updated VIF entry in instance network info cache for port 1c9d862d-8a0d-4c91-834b-72d52fd1fcb7. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 971.314792] env[60548]: DEBUG nova.network.neutron [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Updating instance_info_cache with network_info: [{"id": "1c9d862d-8a0d-4c91-834b-72d52fd1fcb7", "address": "fa:16:3e:df:18:4a", "network": {"id": "dd10a5ee-4ec7-4c98-be7b-6cb397c902f1", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1884105123-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b69a6d4834e24a3cbc4019ee66c0d841", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "23fc30ea-1f06-424d-86e1-27ae5435b1a9", "external-id": "nsx-vlan-transportzone-189", "segmentation_id": 189, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1c9d862d-8a", "ovs_interfaceid": "1c9d862d-8a0d-4c91-834b-72d52fd1fcb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 971.322373] env[60548]: DEBUG nova.network.neutron [req-b4c50ab4-ca4e-4efd-a9b3-6ae14eec8a8b req-802b8c9e-6e17-41e4-82fb-14ff021ac028 service nova] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Updated VIF entry in instance network info cache for port beafefc3-0ed1-4ee5-9e95-b8befdfc339d. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 971.322864] env[60548]: DEBUG nova.network.neutron [req-b4c50ab4-ca4e-4efd-a9b3-6ae14eec8a8b req-802b8c9e-6e17-41e4-82fb-14ff021ac028 service nova] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Updating instance_info_cache with network_info: [{"id": "beafefc3-0ed1-4ee5-9e95-b8befdfc339d", "address": "fa:16:3e:c0:b6:0d", "network": {"id": "016d838c-8bd6-4390-bfdb-e9590ff7afa1", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-47026491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e843b5ba56114cf99c1ae8c7e8617e73", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c1195acd-707f-4bac-a99d-14db17a63802", "external-id": "nsx-vlan-transportzone-322", "segmentation_id": 322, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbeafefc3-0e", "ovs_interfaceid": "beafefc3-0ed1-4ee5-9e95-b8befdfc339d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 971.325433] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Releasing lock "refresh_cache-979c5fe5-051f-4a43-be2f-571aad25a4ae" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 971.325659] env[60548]: DEBUG nova.compute.manager [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Received event network-vif-plugged-c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 971.325852] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Acquiring lock "8f447658-c66d-4d94-af30-fd43c83dae0e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 971.326054] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Lock "8f447658-c66d-4d94-af30-fd43c83dae0e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 971.326216] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Lock "8f447658-c66d-4d94-af30-fd43c83dae0e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 971.326376] env[60548]: DEBUG nova.compute.manager
[req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] No waiting events found dispatching network-vif-plugged-c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 971.326536] env[60548]: WARNING nova.compute.manager [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Received unexpected event network-vif-plugged-c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b for instance with vm_state building and task_state spawning. [ 971.326696] env[60548]: DEBUG nova.compute.manager [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Received event network-changed-c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 971.326845] env[60548]: DEBUG nova.compute.manager [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Refreshing instance network info cache due to event network-changed-c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 971.327031] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Acquiring lock "refresh_cache-8f447658-c66d-4d94-af30-fd43c83dae0e" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 971.327168] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Acquired lock "refresh_cache-8f447658-c66d-4d94-af30-fd43c83dae0e" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 971.327316] env[60548]: DEBUG nova.network.neutron [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Refreshing network info cache for port c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 971.335330] env[60548]: DEBUG oslo_concurrency.lockutils [req-b4c50ab4-ca4e-4efd-a9b3-6ae14eec8a8b req-802b8c9e-6e17-41e4-82fb-14ff021ac028 service nova] Releasing lock "refresh_cache-d0d515a4-15ce-4276-b151-34a8a556a1df" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 971.388297] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 971.388618] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 971.389313] env[60548]: DEBUG 
oslo_concurrency.lockutils [None req-9103344d-50c2-42d3-af1b-42e70de1a0d4 tempest-AttachVolumeTestJSON-958325147 tempest-AttachVolumeTestJSON-958325147-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 971.699271] env[60548]: DEBUG oslo_vmware.api [-] Task: {'id': task-4323414, 'name': CreateVM_Task, 'duration_secs': 0.306668} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 971.699491] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Created VM on the ESX host {{(pid=60548) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 971.700105] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 971.701604] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 971.701604] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 971.701604] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6a91f808-4454-4dd0-9f33-8079398ff9b4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 971.705550] env[60548]: DEBUG oslo_vmware.api [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Waiting for the task: (returnval){ [ 971.705550] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]523ae158-3b33-e7f4-e328-eded9d0e4d6e" [ 971.705550] env[60548]: _type = "Task" [ 971.705550] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 971.717197] env[60548]: DEBUG oslo_vmware.api [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]523ae158-3b33-e7f4-e328-eded9d0e4d6e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 971.953191] env[60548]: DEBUG nova.network.neutron [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Updated VIF entry in instance network info cache for port c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
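The lockutils records above show how concurrent spawns sharing image 5674e50f-0c0c-4f19-8379-104dac34660b are serialized: the datastore cache path itself ("[datastore1] devstack-image-cache_base/<image-id>", and the nested .vmdk path) is used as the lock name, so only one request at a time checks or populates the cached disk via HostDatastoreBrowser.SearchDatastore_Task. A minimal sketch of the same pattern using the real oslo.concurrency lock context manager; ensure_cached_image() and its search_datastore argument are hypothetical placeholders:

    from oslo_concurrency import lockutils

    IMAGE_ID = "5674e50f-0c0c-4f19-8379-104dac34660b"
    CACHE_LOCK = f"[datastore1] devstack-image-cache_base/{IMAGE_ID}"

    def ensure_cached_image(search_datastore):
        # The lock name mirrors the log; the holder may then browse the
        # datastore (SearchDatastore_Task) and fetch the image only if missing.
        with lockutils.lock(CACHE_LOCK):
            return search_datastore(IMAGE_ID)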
[ 971.953581] env[60548]: DEBUG nova.network.neutron [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Updating instance_info_cache with network_info: [{"id": "c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b", "address": "fa:16:3e:aa:59:d3", "network": {"id": "d5f86510-a301-4081-92ea-e8d3f51eda39", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1500644825-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e712a58321b3496d851d463771623d15", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a9abd00f-2cea-40f8-9804-a56b6431192d", "external-id": "nsx-vlan-transportzone-639", "segmentation_id": 639, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc95ecd04-2d", "ovs_interfaceid": "c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 971.963276] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Releasing lock "refresh_cache-8f447658-c66d-4d94-af30-fd43c83dae0e" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 971.963538] env[60548]: DEBUG nova.compute.manager [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Received event network-vif-plugged-8fd59865-b405-4979-a4d6-1dae122ce7c5 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 971.963732] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Acquiring lock "3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 971.963929] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Lock "3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 971.964100] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Lock "3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 971.964343] env[60548]: DEBUG nova.compute.manager
[req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] No waiting events found dispatching network-vif-plugged-8fd59865-b405-4979-a4d6-1dae122ce7c5 {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 971.964520] env[60548]: WARNING nova.compute.manager [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Received unexpected event network-vif-plugged-8fd59865-b405-4979-a4d6-1dae122ce7c5 for instance with vm_state building and task_state spawning. [ 971.964771] env[60548]: DEBUG nova.compute.manager [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Received event network-changed-8fd59865-b405-4979-a4d6-1dae122ce7c5 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 971.965035] env[60548]: DEBUG nova.compute.manager [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Refreshing instance network info cache due to event network-changed-8fd59865-b405-4979-a4d6-1dae122ce7c5. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 971.965320] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Acquiring lock "refresh_cache-3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 971.965539] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Acquired lock "refresh_cache-3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 971.965780] env[60548]: DEBUG nova.network.neutron [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Refreshing network info cache for port 8fd59865-b405-4979-a4d6-1dae122ce7c5 {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 972.218426] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 972.218687] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Processing image 5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 972.218928] env[60548]: DEBUG oslo_concurrency.lockutils [None req-e57cb0b7-943b-4dff-b85d-94722ea01a65 tempest-ServerRescueTestJSON-1707014401 tempest-ServerRescueTestJSON-1707014401-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 972.248916] env[60548]: DEBUG nova.compute.manager [req-697f7071-2b29-4b4a-908d-e4828a263cb2 req-fba1ade5-2993-45e2-a3c5-d3c64204ebc8 service nova] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Received event network-changed-a00a8e60-5bdf-44fc-bc48-27c65f02a00c {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 972.249152] env[60548]: DEBUG nova.compute.manager [req-697f7071-2b29-4b4a-908d-e4828a263cb2 req-fba1ade5-2993-45e2-a3c5-d3c64204ebc8 service nova] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Refreshing instance network info cache due to event network-changed-a00a8e60-5bdf-44fc-bc48-27c65f02a00c. {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 972.249379] env[60548]: DEBUG oslo_concurrency.lockutils [req-697f7071-2b29-4b4a-908d-e4828a263cb2 req-fba1ade5-2993-45e2-a3c5-d3c64204ebc8 service nova] Acquiring lock "refresh_cache-3a4668ee-e420-4ad8-b638-95b3d55e00c1" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 972.249541] env[60548]: DEBUG oslo_concurrency.lockutils [req-697f7071-2b29-4b4a-908d-e4828a263cb2 req-fba1ade5-2993-45e2-a3c5-d3c64204ebc8 service nova] Acquired lock "refresh_cache-3a4668ee-e420-4ad8-b638-95b3d55e00c1" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 972.249731] env[60548]: DEBUG nova.network.neutron [req-697f7071-2b29-4b4a-908d-e4828a263cb2 req-fba1ade5-2993-45e2-a3c5-d3c64204ebc8 service nova] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Refreshing network info cache for port a00a8e60-5bdf-44fc-bc48-27c65f02a00c {{(pid=60548) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 972.535846] env[60548]: DEBUG nova.network.neutron [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Updated VIF entry in instance network info cache for port 8fd59865-b405-4979-a4d6-1dae122ce7c5. 
{{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 972.536186] env[60548]: DEBUG nova.network.neutron [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Updating instance_info_cache with network_info: [{"id": "8fd59865-b405-4979-a4d6-1dae122ce7c5", "address": "fa:16:3e:c7:e9:85", "network": {"id": "a0196fe2-ed4b-4afd-a480-0086ee6ae168", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-794474085-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c0d2d521309468ba27acf66751f02f7", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "92e4d027-e755-417b-8eea-9a8f24b85140", "external-id": "nsx-vlan-transportzone-756", "segmentation_id": 756, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8fd59865-b4", "ovs_interfaceid": "8fd59865-b405-4979-a4d6-1dae122ce7c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 972.550055] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Releasing lock "refresh_cache-3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 972.550438] env[60548]: DEBUG nova.compute.manager [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Received event network-vif-plugged-a00a8e60-5bdf-44fc-bc48-27c65f02a00c {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 972.550632] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Acquiring lock "3a4668ee-e420-4ad8-b638-95b3d55e00c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 972.550829] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Lock "3a4668ee-e420-4ad8-b638-95b3d55e00c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 972.550991] env[60548]: DEBUG oslo_concurrency.lockutils [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] Lock "3a4668ee-e420-4ad8-b638-95b3d55e00c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 972.551168] env[60548]: DEBUG nova.compute.manager
[req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] No waiting events found dispatching network-vif-plugged-a00a8e60-5bdf-44fc-bc48-27c65f02a00c {{(pid=60548) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 972.551331] env[60548]: WARNING nova.compute.manager [req-e3509e9c-06b3-423c-8ba1-375baabab6cc req-bf3ee985-5390-44cd-a628-7bfeb19a7c7c service nova] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Received unexpected event network-vif-plugged-a00a8e60-5bdf-44fc-bc48-27c65f02a00c for instance with vm_state building and task_state spawning. [ 972.784065] env[60548]: DEBUG nova.network.neutron [req-697f7071-2b29-4b4a-908d-e4828a263cb2 req-fba1ade5-2993-45e2-a3c5-d3c64204ebc8 service nova] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Updated VIF entry in instance network info cache for port a00a8e60-5bdf-44fc-bc48-27c65f02a00c. {{(pid=60548) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 972.784447] env[60548]: DEBUG nova.network.neutron [req-697f7071-2b29-4b4a-908d-e4828a263cb2 req-fba1ade5-2993-45e2-a3c5-d3c64204ebc8 service nova] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Updating instance_info_cache with network_info: [{"id": "a00a8e60-5bdf-44fc-bc48-27c65f02a00c", "address": "fa:16:3e:99:ea:b7", "network": {"id": "542e1d67-cf01-44bc-93b3-0e0ae5507f08", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-273292017-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "42dfcefd33194f5fb1356f7abc1cccad", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "dd96b39f-bd2e-48d1-85c3-577cf97f08c8", "external-id": "cl2-zone-84", "segmentation_id": 84, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa00a8e60-5b", "ovs_interfaceid": "a00a8e60-5bdf-44fc-bc48-27c65f02a00c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 972.795641] env[60548]: DEBUG oslo_concurrency.lockutils [req-697f7071-2b29-4b4a-908d-e4828a263cb2 req-fba1ade5-2993-45e2-a3c5-d3c64204ebc8 service nova] Releasing lock "refresh_cache-3a4668ee-e420-4ad8-b638-95b3d55e00c1" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 990.175762] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 991.172000] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 991.172139] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task 
ComputeManager._instance_usage_audit {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 992.172459] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 992.172756] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60548) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 992.172814] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 992.184718] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 992.184938] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 992.185124] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 992.185281] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60548) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 992.186364] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff01a2ab-3bb1-453d-9129-b39f7ca60182 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 992.197162] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5448c7dc-ada8-4bcb-9aea-6ee7f4d653c8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 992.211520] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-037252b0-b359-42b9-86d6-c91937cd0607 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 992.218266] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a0add7a-e996-4a70-a97f-3a1e38dc5988 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 992.247864] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 
free_ram=180651MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60548) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 992.248037] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 992.248281] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 992.309292] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance ad98988d-92aa-4ace-8e40-cd316758002e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 992.309466] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 585e3015-faef-40df-b3dd-04d2c8e4dd00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 992.309579] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance e3fd811a-186d-436f-bdef-a910a3ccd416 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 992.309756] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 8f447658-c66d-4d94-af30-fd43c83dae0e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 992.309842] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 979c5fe5-051f-4a43-be2f-571aad25a4ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 992.309924] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance e6466fbb-a225-4bbd-839b-f8c4b24d9860 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 992.310053] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 992.310175] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance d0d515a4-15ce-4276-b151-34a8a556a1df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 992.310327] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3a4668ee-e420-4ad8-b638-95b3d55e00c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 992.310589] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 992.310744] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=100GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 992.425711] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff06bdcd-f812-4e90-9095-88acc798357d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 992.433658] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d442752-3d7b-443f-9f6b-136425ab786e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 992.464058] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74af28a7-5a55-4250-9b7e-defc5d3a9065 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 992.471449] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8519d759-d6f2-4fe3-aebe-7d30cf2c0cee {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 992.485196] env[60548]: DEBUG nova.compute.provider_tree [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 992.494229] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
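The resource-tracker figures above are internally consistent: nine instances are listed as actively managed, each holding a placement allocation of {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, and the MEMORY_MB inventory reserves 512 MB for the host, which reproduces the reported used_ram=1664MB, used_disk=9GB and used_vcpus=9 exactly. A quick arithmetic check:

    # Consistency check of the "Final resource view" figures above.
    instances = 9
    per_instance = {"VCPU": 1, "MEMORY_MB": 128, "DISK_GB": 1}
    reserved_ram_mb = 512  # 'reserved' in the MEMORY_MB inventory record

    used_vcpus = instances * per_instance["VCPU"]                          # 9
    used_ram_mb = reserved_ram_mb + instances * per_instance["MEMORY_MB"]  # 1664
    used_disk_gb = instances * per_instance["DISK_GB"]                     # 9
    assert (used_vcpus, used_ram_mb, used_disk_gb) == (9, 1664, 9)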
[ 992.507786] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60548) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 992.507967] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 993.507790] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 994.167541] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 995.172390] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 995.172739] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Starting heal instance info cache {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 995.172739] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Rebuilding the list of instances to heal {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 995.192721] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 995.193165] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 995.194046] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Skipping network cache update for instance because it is Building.
{{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 995.194046] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 995.194046] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 995.194046] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 995.194046] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 995.194046] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 995.194301] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 995.194301] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Didn't find any instances for network info cache update. 
{{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 996.172035] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 996.192961] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1012.098641] env[60548]: WARNING oslo_vmware.rw_handles [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1012.098641] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1012.098641] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1012.098641] env[60548]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1012.098641] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1012.098641] env[60548]: ERROR oslo_vmware.rw_handles response.begin() [ 1012.098641] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1012.098641] env[60548]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1012.098641] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1012.098641] env[60548]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1012.098641] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1012.098641] env[60548]: ERROR oslo_vmware.rw_handles [ 1012.099484] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to vmware_temp/ec2ae176-f15e-432a-a067-d974c29ad327/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1012.100900] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Caching image {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1012.101163] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Copying Virtual Disk [datastore1] vmware_temp/ec2ae176-f15e-432a-a067-d974c29ad327/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk to [datastore1] 
vmware_temp/ec2ae176-f15e-432a-a067-d974c29ad327/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk {{(pid=60548) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1012.101448] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-80292f41-bbf0-426a-b7bc-e46bae6492c3 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1012.110310] env[60548]: DEBUG oslo_vmware.api [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Waiting for the task: (returnval){ [ 1012.110310] env[60548]: value = "task-4323415" [ 1012.110310] env[60548]: _type = "Task" [ 1012.110310] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1012.118639] env[60548]: DEBUG oslo_vmware.api [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Task: {'id': task-4323415, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1012.622621] env[60548]: DEBUG oslo_vmware.exceptions [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Fault InvalidArgument not matched. {{(pid=60548) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1012.622621] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1012.622621] env[60548]: ERROR nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1012.622621] env[60548]: Faults: ['InvalidArgument'] [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Traceback (most recent call last): [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] yield resources [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] self.driver.spawn(context, instance, image_meta, [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 
30cf201d-7a1c-479c-9040-fba38726d9ab] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] self._fetch_image_if_missing(context, vi) [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] image_cache(vi, tmp_image_ds_loc) [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] vm_util.copy_virtual_disk( [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] session._wait_for_task(vmdk_copy_task) [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] return self.wait_for_task(task_ref) [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] return evt.wait() [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] result = hub.switch() [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] return self.greenlet.switch() [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] self.f(*self.args, **self.kw) [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] raise exceptions.translate_fault(task_info.error) [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: 
fileType [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Faults: ['InvalidArgument'] [ 1012.622621] env[60548]: ERROR nova.compute.manager [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] [ 1012.622621] env[60548]: INFO nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Terminating instance [ 1012.624438] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1012.624718] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1012.625399] env[60548]: DEBUG nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1012.625668] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1012.625953] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-455cd113-51cc-40d9-a5df-5bc4f7b94108 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1012.628443] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6cb4354-39e7-46af-b987-fc5ff969eb75 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1012.636390] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1012.636731] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4122dbe1-e42e-4590-a2cb-6353d4209dea {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1012.639162] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1012.639439] 
env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1012.640455] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6f0d19a2-72e9-4299-a4c5-aa71b1f23356 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1012.645565] env[60548]: DEBUG oslo_vmware.api [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Waiting for the task: (returnval){ [ 1012.645565] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52761a14-0acb-23cd-636c-527a56e24e77" [ 1012.645565] env[60548]: _type = "Task" [ 1012.645565] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1012.654552] env[60548]: DEBUG oslo_vmware.api [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52761a14-0acb-23cd-636c-527a56e24e77, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1012.703164] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1012.703448] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1012.703745] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Deleting the datastore file [datastore1] 30cf201d-7a1c-479c-9040-fba38726d9ab {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1012.704087] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1c1f001d-0ecd-48c4-9fbb-7a3756705fcf {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1012.710835] env[60548]: DEBUG oslo_vmware.api [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Waiting for the task: (returnval){ [ 1012.710835] env[60548]: value = "task-4323417" [ 1012.710835] env[60548]: _type = "Task" [ 1012.710835] env[60548]: } to complete. 
{{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1012.719155] env[60548]: DEBUG oslo_vmware.api [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Task: {'id': task-4323417, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1013.156396] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1013.156712] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Creating directory with path [datastore1] vmware_temp/b94744f6-4923-44c6-bde9-0be752a5851d/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1013.156929] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4a86d646-a4ec-4e58-945b-e42dc3242e7b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.169996] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Created directory with path [datastore1] vmware_temp/b94744f6-4923-44c6-bde9-0be752a5851d/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1013.169996] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Fetch image to [datastore1] vmware_temp/b94744f6-4923-44c6-bde9-0be752a5851d/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1013.169996] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/b94744f6-4923-44c6-bde9-0be752a5851d/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1013.170422] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-847838f2-0d3d-4504-98b5-2d4461ebfd68 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.177462] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c2dc5e6-ff80-43c0-a8be-3cb01ce7cce8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.187678] env[60548]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff2a9584-51ce-4810-b5da-8d8d42f6da24 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.221747] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab36c2ea-1734-43dc-90d9-04ea248f5b9b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.229718] env[60548]: DEBUG oslo_vmware.api [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Task: {'id': task-4323417, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074969} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1013.231328] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1013.232009] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1013.232277] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1013.232471] env[60548]: INFO nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Took 0.61 seconds to destroy the instance on the hypervisor. 
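The DeleteDatastoreFile_Task records above follow oslo.vmware's wait-for-task pattern end to end: the task is submitted at 1012.704, a "Waiting for the task" record is emitted, _poll_task reports "progress is 0%", and the task finally comes back "completed successfully" with duration_secs 0.074969. A minimal sketch of that pattern follows, assuming a caller-supplied get_task_info callable and an illustrative TaskInfo shape; the real loop lives in oslo_vmware.api (wait_for_task at api.py:397, _poll_task at api.py:434/444, driven by a looping call on eventlet), not in this code.

import time
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class TaskInfo:
    # Illustrative shape only; vSphere tasks report queued/running/success/error.
    state: str
    progress: int = 0
    error: Optional[str] = None

def wait_for_task(get_task_info: Callable[[str], TaskInfo],
                  task_ref: str,
                  poll_interval: float = 0.5) -> TaskInfo:
    """Poll a vCenter task reference until it reaches a terminal state."""
    while True:
        info = get_task_info(task_ref)  # one PropertyCollector round trip per poll
        if info.state == 'success':
            return info
        if info.state == 'error':
            # oslo.vmware translates the server fault at this point; cf. the
            # "Fault InvalidArgument not matched" DEBUG lines in this log.
            raise RuntimeError(info.error or 'task failed')
        # Still queued/running: this is where "progress is 0%" gets logged.
        time.sleep(poll_interval)

The same shape explains why a sub-second file delete (duration_secs 0.074969) and the much longer CopyVirtualDisk_Task earlier both produce identical "Waiting for the task" / "progress is 0%" record pairs.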
[ 1013.234304] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5d5a6adb-61c2-4fe4-80ee-b93ef20dc1c5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.236373] env[60548]: DEBUG nova.compute.claims [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1013.236545] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1013.236754] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1013.262862] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1013.262862] env[60548]: DEBUG nova.compute.utils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Instance 30cf201d-7a1c-479c-9040-fba38726d9ab could not be found. {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1013.264919] env[60548]: DEBUG nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Instance disappeared during build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1013.265110] env[60548]: DEBUG nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1013.265277] env[60548]: DEBUG nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1013.265428] env[60548]: DEBUG nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1013.265584] env[60548]: DEBUG nova.network.neutron [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1013.292055] env[60548]: DEBUG nova.network.neutron [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1013.300858] env[60548]: INFO nova.compute.manager [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Took 0.03 seconds to deallocate network for instance. [ 1013.325507] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1013.342503] env[60548]: DEBUG oslo_concurrency.lockutils [None req-9bf5722c-1236-4b33-8006-6c056614262c tempest-DeleteServersAdminTestJSON-523328319 tempest-DeleteServersAdminTestJSON-523328319-project-member] Lock "30cf201d-7a1c-479c-9040-fba38726d9ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 257.541s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1013.342995] env[60548]: DEBUG oslo_concurrency.lockutils [req-ee580dce-7fcd-42d7-acc7-b70eb2d1a8a5 req-0e447284-8d1b-4553-a189-36809cf3db37 service nova] Acquired lock "30cf201d-7a1c-479c-9040-fba38726d9ab" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1013.343833] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6b2e433-38ca-4d98-b0a0-7e837b8640e5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.357517] env[60548]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. 
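The suds warning above and the "Fault list: [ManagedObjectNotFound]" record that follows show the fault-translation step: suds surfaces the SOAP fault despite the HTTP 200, and oslo.vmware maps the fault name to a dedicated exception class when one exists, falling back to a generic VimFaultException otherwise (that fallback is exactly what the earlier "Fault InvalidArgument not matched" DEBUG line records). A schematic of that dispatch, with an illustrative registry rather than oslo_vmware.exceptions' real one (the actual logic is get_fault_class at exceptions.py:290):

# Schematic only; the real mapping lives in oslo_vmware.exceptions.
class VimFaultException(Exception):
    """Generic carrier used when no dedicated class matches the fault name."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

class ManagedObjectNotFoundException(VimFaultException):
    """Dedicated class: the "Fault list: [ManagedObjectNotFound]" case."""

_KNOWN_FAULTS = {
    'ManagedObjectNotFound': ManagedObjectNotFoundException,
    # 'InvalidArgument' has no entry, hence "Fault InvalidArgument not
    # matched." and the generic VimFaultException seen in the traceback
    # above ("A specified parameter was not correct: fileType").
}

def translate_fault(fault_name: str, message: str) -> VimFaultException:
    cls = _KNOWN_FAULTS.get(fault_name, VimFaultException)
    return cls([fault_name], message)

Downstream, nova catches the ManagedObjectNotFound case and re-raises it as InstanceNotFound, which is precisely the "Original exception being dropped" chain recorded at 1013.451823 below.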
[ 1013.357517] env[60548]: DEBUG oslo_vmware.api [-] Fault list: [ManagedObjectNotFound] {{(pid=60548) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 1013.357806] env[60548]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-dc30cc0f-e9f7-4e96-a903-21b646eeb787 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.366705] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84990784-5c13-4bca-b971-5dd4045acf5a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1013.377822] env[60548]: DEBUG oslo_vmware.rw_handles [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b94744f6-4923-44c6-bde9-0be752a5851d/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1013.436023] env[60548]: DEBUG oslo_vmware.rw_handles [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Completed reading data from the image iterator. {{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1013.436235] env[60548]: DEBUG oslo_vmware.rw_handles [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b94744f6-4923-44c6-bde9-0be752a5851d/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1013.451823] env[60548]: ERROR root [req-ee580dce-7fcd-42d7-acc7-b70eb2d1a8a5 req-0e447284-8d1b-4553-a189-36809cf3db37 service nova] Original exception being dropped: ['Traceback (most recent call last):\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py", line 377, in request_handler\n response = request(managed_object, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 586, in __call__\n return client.invoke(args, kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 728, in invoke\n result = self.send(soapenv, timeout=timeout)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 777, in send\n return self.process_reply(reply.message, None, None)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 840, in process_reply\n raise WebFault(fault, replyroot)\n', "suds.WebFault: Server raised fault: 'The object 'vim.VirtualMachine:vm-850345' has already been deleted or has not been completely created'\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 301, in _invoke_api\n return api_method(*args, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/vim_util.py", line 480, in get_object_property\n props = get_object_properties(vim, moref, [property_name],\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/vim_util.py", line 360, in get_object_properties\n retrieve_result = vim.RetrievePropertiesEx(\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py", line 413, in request_handler\n raise exceptions.VimFaultException(fault_list, fault_string,\n', "oslo_vmware.exceptions.VimFaultException: The object 'vim.VirtualMachine:vm-850345' has already been deleted or has not been completely created\nCause: Server raised fault: 'The object 'vim.VirtualMachine:vm-850345' has already been deleted or has not been completely created'\nFaults: [ManagedObjectNotFound]\nDetails: {'obj': 'vm-850345'}\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 123, in _call_method\n return self.invoke_api(module, method, self.vim, *args,\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 358, in invoke_api\n return _invoke_api(module, method, *args, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 122, in func\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 122, in _inner\n idle = self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 96, in _func\n result = f(*args, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 341, in _invoke_api\n raise clazz(str(excep),\n', "oslo_vmware.exceptions.ManagedObjectNotFoundException: The object 'vim.VirtualMachine:vm-850345' has already been deleted or has not been completely 
created\nCause: Server raised fault: 'The object 'vim.VirtualMachine:vm-850345' has already been deleted or has not been completely created'\nFaults: [ManagedObjectNotFound]\nDetails: {'obj': 'vm-850345'}\n"]: nova.exception.InstanceNotFound: Instance 30cf201d-7a1c-479c-9040-fba38726d9ab could not be found. [ 1013.452065] env[60548]: DEBUG oslo_concurrency.lockutils [req-ee580dce-7fcd-42d7-acc7-b70eb2d1a8a5 req-0e447284-8d1b-4553-a189-36809cf3db37 service nova] Releasing lock "30cf201d-7a1c-479c-9040-fba38726d9ab" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1013.452299] env[60548]: DEBUG nova.compute.manager [req-ee580dce-7fcd-42d7-acc7-b70eb2d1a8a5 req-0e447284-8d1b-4553-a189-36809cf3db37 service nova] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Detach interface failed, port_id=5274de6d-19ff-4581-a3b2-246b42ce746a, reason: Instance 30cf201d-7a1c-479c-9040-fba38726d9ab could not be found. {{(pid=60548) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10838}} [ 1050.175777] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1051.171678] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1053.171687] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1053.172104] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1053.172104] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1053.183877] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1053.184111] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1053.184277] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1053.184437] env[60548]: DEBUG 
nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60548) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1053.185654] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1c7bf6e-e2c7-4b60-98f6-bd8500723dd1 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.194709] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2100deea-874f-40d8-b6ce-7edbbfbd5de2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.209572] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6274c565-4bc2-4117-b77e-72f754f7c8ef {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.215593] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9366c10-2a95-43fd-abad-cf62b45c06db {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.246124] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180701MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60548) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1053.246315] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1053.246471] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1053.309863] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance ad98988d-92aa-4ace-8e40-cd316758002e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.310103] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 585e3015-faef-40df-b3dd-04d2c8e4dd00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.310269] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance e3fd811a-186d-436f-bdef-a910a3ccd416 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.310407] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 8f447658-c66d-4d94-af30-fd43c83dae0e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.310531] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 979c5fe5-051f-4a43-be2f-571aad25a4ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.310651] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance e6466fbb-a225-4bbd-839b-f8c4b24d9860 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.310769] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.310886] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance d0d515a4-15ce-4276-b151-34a8a556a1df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.311016] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Instance 3a4668ee-e420-4ad8-b638-95b3d55e00c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60548) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1053.311214] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1053.311350] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=100GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1053.421321] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4f1bf2a-586e-4a84-a805-df01f26d1072 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.429727] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f299811e-0c4e-41f2-9111-153c984a636d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.459533] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc0a3b27-5b1f-41ef-a935-1005ff4c7665 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.467475] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c19b600-e93b-489a-83e2-f7c2d49b0b55 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1053.481840] env[60548]: DEBUG nova.compute.provider_tree [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1053.492138] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1053.505078] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60548) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1053.505264] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.259s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1054.505156] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] 
Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1054.505443] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60548) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1056.167757] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1056.171264] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1056.171400] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Starting heal instance info cache {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1056.171520] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Rebuilding the list of instances to heal {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1056.190844] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1056.191035] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1056.191172] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1056.191299] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1056.191423] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1056.191544] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Skipping network cache update for instance because it is Building. 
{{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1056.191663] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1056.191780] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1056.191896] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Skipping network cache update for instance because it is Building. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1056.192019] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Didn't find any instances for network info cache update. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1058.171749] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1061.136788] env[60548]: WARNING oslo_vmware.rw_handles [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles response.begin() [ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles [ 1061.136788] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to vmware_temp/b94744f6-4923-44c6-bde9-0be752a5851d/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) 
[ 1061.136788] env[60548]: WARNING oslo_vmware.rw_handles [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles response.begin()
[ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1061.136788] env[60548]: ERROR oslo_vmware.rw_handles
[ 1061.136788] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to vmware_temp/b94744f6-4923-44c6-bde9-0be752a5851d/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1061.138989] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Caching image {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1061.139272] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Copying Virtual Disk [datastore1] vmware_temp/b94744f6-4923-44c6-bde9-0be752a5851d/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk to [datastore1] vmware_temp/b94744f6-4923-44c6-bde9-0be752a5851d/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk {{(pid=60548) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1061.139578] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-036bb7bd-7b88-4fa6-ba86-d3f838ccf51b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1061.149237] env[60548]: DEBUG oslo_vmware.api [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Waiting for the task: (returnval){
[ 1061.149237] env[60548]: value = "task-4323418"
[ 1061.149237] env[60548]: _type = "Task"
[ 1061.149237] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1061.158211] env[60548]: DEBUG oslo_vmware.api [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Task: {'id': task-4323418, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1061.659726] env[60548]: DEBUG oslo_vmware.exceptions [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Fault InvalidArgument not matched. {{(pid=60548) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 1061.659909] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1061.660443] env[60548]: ERROR nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1061.660443] env[60548]: Faults: ['InvalidArgument']
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Traceback (most recent call last):
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] yield resources
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] self.driver.spawn(context, instance, image_meta,
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] self._fetch_image_if_missing(context, vi)
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] image_cache(vi, tmp_image_ds_loc)
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] vm_util.copy_virtual_disk(
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] session._wait_for_task(vmdk_copy_task)
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] return self.wait_for_task(task_ref)
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] return evt.wait()
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] result = hub.switch()
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] return self.greenlet.switch()
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] self.f(*self.args, **self.kw)
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] raise exceptions.translate_fault(task_info.error)
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Faults: ['InvalidArgument']
[ 1061.660443] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00]
[ 1061.661426] env[60548]: INFO nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Terminating instance
[ 1061.662417] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1061.662639] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1061.662870] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6465af0f-6b2c-4b89-9231-b8a66ad0e8f8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1061.665312] env[60548]: DEBUG nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1061.665507] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1061.666228] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee1f58ae-c362-4ec4-a5c8-c34d9a274c16 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1061.673413] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1061.673627] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-12e01ce3-cb4f-470f-a17a-934411c5d072 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1061.675765] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1061.675942] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1061.676944] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-25422259-3188-482e-8c05-1059ea2e72f8 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1061.682182] env[60548]: DEBUG oslo_vmware.api [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Waiting for the task: (returnval){
[ 1061.682182] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52fa2371-6f7d-97c1-ce82-a3757c784b47"
[ 1061.682182] env[60548]: _type = "Task"
[ 1061.682182] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1061.692115] env[60548]: DEBUG oslo_vmware.api [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]52fa2371-6f7d-97c1-ce82-a3757c784b47, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1061.744185] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1061.744441] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1061.744595] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Deleting the datastore file [datastore1] 585e3015-faef-40df-b3dd-04d2c8e4dd00 {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1061.744862] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4294c913-7eac-4f92-a188-274c358986f2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1061.752081] env[60548]: DEBUG oslo_vmware.api [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Waiting for the task: (returnval){
[ 1061.752081] env[60548]: value = "task-4323420"
[ 1061.752081] env[60548]: _type = "Task"
[ 1061.752081] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1061.760579] env[60548]: DEBUG oslo_vmware.api [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Task: {'id': task-4323420, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1062.193107] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1062.193417] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Creating directory with path [datastore1] vmware_temp/c3436550-9073-41b9-9217-7d1bff8312d9/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1062.193616] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-581a51f6-ce5a-4d77-a486-d61028429c19 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1062.206494] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Created directory with path [datastore1] vmware_temp/c3436550-9073-41b9-9217-7d1bff8312d9/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1062.206699] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Fetch image to [datastore1] vmware_temp/c3436550-9073-41b9-9217-7d1bff8312d9/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1062.206871] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/c3436550-9073-41b9-9217-7d1bff8312d9/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1062.207641] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90bd7106-023f-4912-b78a-88f088676655 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1062.215114] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12a1ff31-1dc7-4bcc-b27d-b28d1faf6ae4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1062.224461] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7c7a59d-9fbd-492a-a5a5-4204147d3610 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1062.259844] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42f023b3-271b-4881-9a52-9a3a70ddd17a {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1062.269015] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c0910e90-9491-4064-bfc0-4a67478e78ac {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1062.270894] env[60548]: DEBUG oslo_vmware.api [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Task: {'id': task-4323420, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081374} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1062.271196] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1062.271351] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1062.271525] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1062.271690] env[60548]: INFO nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Took 0.61 seconds to destroy the instance on the hypervisor.
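[Editor's note: the spawn failure above surfaces as an oslo.vmware VimFaultException whose fault_list carries the raw vCenter fault name ('InvalidArgument'). The sketch below shows, under stated assumptions, how a caller might start a CopyVirtualDisk_Task and inspect that fault; the session setup, hostnames, and credentials are placeholders, and error handling is illustrative rather than Nova's actual code.]

```python
from oslo_vmware import api as vmware_api
from oslo_vmware import exceptions as vmware_exc

# Placeholder connection details -- not taken from this environment.
session = vmware_api.VMwareAPISession(
    'vcenter.example.org', 'user', 'secret',
    api_retry_count=3, task_poll_interval=0.5)


def copy_disk(source_ds_path, dest_ds_path, dc_ref):
    """Copy a VMDK and translate vCenter faults (hedged sketch)."""
    vim = session.vim
    task = session.invoke_api(
        vim, 'CopyVirtualDisk_Task',
        vim.service_content.virtualDiskManager,
        sourceName=source_ds_path, sourceDatacenter=dc_ref,
        destName=dest_ds_path)
    try:
        # wait_for_task() is what emits the "progress is 0%." records
        # and raises the translated fault on task failure.
        session.wait_for_task(task)
    except vmware_exc.VimFaultException as exc:
        # fault_list holds the raw fault names, e.g. ['InvalidArgument'],
        # matching the "Faults: ['InvalidArgument']" lines above.
        if 'InvalidArgument' in exc.fault_list:
            raise RuntimeError(
                'vCenter rejected a CopyVirtualDisk parameter: %s' % exc)
        raise
```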
[ 1062.273864] env[60548]: DEBUG nova.compute.claims [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1062.274046] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1062.274258] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1062.296162] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1062.343015] env[60548]: DEBUG oslo_vmware.rw_handles [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c3436550-9073-41b9-9217-7d1bff8312d9/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1062.401189] env[60548]: DEBUG oslo_vmware.rw_handles [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Completed reading data from the image iterator. {{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 1062.401407] env[60548]: DEBUG oslo_vmware.rw_handles [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c3436550-9073-41b9-9217-7d1bff8312d9/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1062.480089] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c770071f-8ecd-48b1-babe-87f977434c4f {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1062.488981] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19110d27-d478-4432-a346-bd10374cbd36 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1062.518602] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad778050-a9e8-4cda-9b2d-8aa4fa2ba6aa {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1062.525905] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e37b4da2-8998-44d8-8ad9-a5b4ba58d460 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1062.539412] env[60548]: DEBUG nova.compute.provider_tree [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1062.548260] env[60548]: DEBUG nova.scheduler.client.report [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1062.567023] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.291s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1062.567023] env[60548]: ERROR nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1062.567023] env[60548]: Faults: ['InvalidArgument']
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Traceback (most recent call last):
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] self.driver.spawn(context, instance, image_meta,
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] self._fetch_image_if_missing(context, vi)
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] image_cache(vi, tmp_image_ds_loc)
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] vm_util.copy_virtual_disk(
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] session._wait_for_task(vmdk_copy_task)
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] return self.wait_for_task(task_ref)
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] return evt.wait()
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] result = hub.switch()
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] return self.greenlet.switch()
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] self.f(*self.args, **self.kw)
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] raise exceptions.translate_fault(task_info.error)
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Faults: ['InvalidArgument']
[ 1062.567023] env[60548]: ERROR nova.compute.manager [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00]
[ 1062.567023] env[60548]: DEBUG nova.compute.utils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] VimFaultException {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1062.568564] env[60548]: DEBUG nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Build of instance 585e3015-faef-40df-b3dd-04d2c8e4dd00 was re-scheduled: A specified parameter was not correct: fileType
[ 1062.568564] env[60548]: Faults: ['InvalidArgument'] {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 1062.569025] env[60548]: DEBUG nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1062.569208] env[60548]: DEBUG nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1062.569375] env[60548]: DEBUG nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1062.569534] env[60548]: DEBUG nova.network.neutron [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1062.863734] env[60548]: DEBUG nova.network.neutron [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1062.874941] env[60548]: INFO nova.compute.manager [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] [instance: 585e3015-faef-40df-b3dd-04d2c8e4dd00] Took 0.31 seconds to deallocate network for instance.
[ 1063.657770] env[60548]: INFO nova.scheduler.client.report [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Deleted allocations for instance 585e3015-faef-40df-b3dd-04d2c8e4dd00
[ 1063.677814] env[60548]: DEBUG oslo_concurrency.lockutils [None req-ba991eed-553c-44e0-b30c-89e88793c3a2 tempest-ServersNegativeTestJSON-1737616329 tempest-ServersNegativeTestJSON-1737616329-project-member] Lock "585e3015-faef-40df-b3dd-04d2c8e4dd00" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 185.305s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
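[Editor's note: the "Acquiring lock ... / Lock ... acquired ... waited / Lock ... released ... held" triples above are oslo.concurrency's lockutils instrumentation around named locks such as "compute_resources". A minimal sketch of the same pattern follows; the function body is invented for illustration.]

```python
from oslo_concurrency import lockutils

# Decorator form, as the resource tracker uses for "compute_resources";
# the body here is invented.
@lockutils.synchronized('compute_resources')
def abort_instance_claim(instance_uuid):
    # Only one thread at a time mutates the tracked resource usage.
    # The wait and held durations logged above come from this wrapper.
    print('reverting claim for', instance_uuid)

# Context-manager form, equivalent for ad-hoc critical sections.
with lockutils.lock('compute_resources'):
    print('inside the critical section')
```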
[ 1072.380780] env[60548]: DEBUG nova.compute.manager [req-98145285-b76a-4335-87e2-9dedf16f6a5c req-f1a2b634-d9f0-436d-bc9a-a477a6c15fc6 service nova] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Received event network-vif-deleted-59582163-0304-4f82-9a45-39db58047179 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1078.008435] env[60548]: DEBUG nova.compute.manager [req-f01b8e3a-343c-4e37-bf79-3d3b2a34b45c req-9ed2d0bc-50d2-42bf-8df0-51472ba0606f service nova] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Received event network-vif-deleted-1c9d862d-8a0d-4c91-834b-72d52fd1fcb7 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1080.050081] env[60548]: DEBUG nova.compute.manager [req-3140ab69-19a8-4553-a52b-00a689c3d324 req-0b5f18fb-4ff6-40b9-aba8-191356ee7954 service nova] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Received event network-vif-deleted-6ac0f8a3-c5bf-4a38-8a33-9cafae07546f {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1080.050081] env[60548]: DEBUG nova.compute.manager [req-3140ab69-19a8-4553-a52b-00a689c3d324 req-0b5f18fb-4ff6-40b9-aba8-191356ee7954 service nova] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Received event network-vif-deleted-c95ecd04-2d9a-4470-a5bf-7b4eebae8e2b {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1080.050081] env[60548]: DEBUG nova.compute.manager [req-3140ab69-19a8-4553-a52b-00a689c3d324 req-0b5f18fb-4ff6-40b9-aba8-191356ee7954 service nova] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Received event network-vif-deleted-0fecdaa6-f624-45a1-b74d-c1443e12532d {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1082.365752] env[60548]: DEBUG nova.compute.manager [req-b4ff5add-b329-4525-b84a-930719f9fab1 req-b5a3bc2c-2d54-4267-8bc6-bde01f92630e service nova] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Received event network-vif-deleted-8fd59865-b405-4979-a4d6-1dae122ce7c5 {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1082.366405] env[60548]: DEBUG nova.compute.manager [req-b4ff5add-b329-4525-b84a-930719f9fab1 req-b5a3bc2c-2d54-4267-8bc6-bde01f92630e service nova] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Received event network-vif-deleted-beafefc3-0ed1-4ee5-9e95-b8befdfc339d {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1106.172973] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1109.038834] env[60548]: DEBUG nova.compute.manager [req-e786ede6-c369-4c4e-8b7c-25d3d405f816 req-8dfe6652-5e91-4366-a645-9d54b171de90 service nova] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Received event network-vif-deleted-a00a8e60-5bdf-44fc-bc48-27c65f02a00c {{(pid=60548) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1111.180067] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1111.569720] env[60548]: WARNING oslo_vmware.rw_handles [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1111.569720] env[60548]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1111.569720] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1111.569720] env[60548]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1111.569720] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1111.569720] env[60548]: ERROR oslo_vmware.rw_handles response.begin()
[ 1111.569720] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1111.569720] env[60548]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1111.569720] env[60548]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1111.569720] env[60548]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1111.569720] env[60548]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1111.569720] env[60548]: ERROR oslo_vmware.rw_handles
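[Editor's note: this recurring RemoteDisconnected warning fires inside rw_handles.close(): the VMDK bytes have already been streamed to the host, which then drops the connection before sending a status line, so getresponse() raises. The stdlib reproduction below illustrates the failure mode; the host and path are placeholders.]

```python
import http.client

# Placeholder endpoint standing in for the ESX datastore upload URL.
conn = http.client.HTTPSConnection('esx.example.org', 443)
try:
    conn.request('PUT', '/folder/tmp-sparse.vmdk?dsName=datastore1',
                 body=b'...vmdk bytes...')
    resp = conn.getresponse()          # server replied normally
    print(resp.status, resp.reason)
except http.client.RemoteDisconnected:
    # The peer closed the socket without sending a status line. If all
    # request bytes were already written, the transfer itself may still
    # have succeeded -- which is why the log continues with "Downloaded
    # image file data ..." immediately after the warning.
    pass
finally:
    conn.close()
```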
http.client.RemoteDisconnected: Remote end closed connection without response [ 1111.569720] env[60548]: ERROR oslo_vmware.rw_handles [ 1111.570541] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Downloaded image file data 5674e50f-0c0c-4f19-8379-104dac34660b to vmware_temp/c3436550-9073-41b9-9217-7d1bff8312d9/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1111.572479] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Caching image {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1111.572772] env[60548]: DEBUG nova.virt.vmwareapi.vm_util [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Copying Virtual Disk [datastore1] vmware_temp/c3436550-9073-41b9-9217-7d1bff8312d9/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk to [datastore1] vmware_temp/c3436550-9073-41b9-9217-7d1bff8312d9/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk {{(pid=60548) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1111.573076] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d3776f61-b6ab-4572-bafe-00a1b3a71654 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1111.581333] env[60548]: DEBUG oslo_vmware.api [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Waiting for the task: (returnval){ [ 1111.581333] env[60548]: value = "task-4323421" [ 1111.581333] env[60548]: _type = "Task" [ 1111.581333] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1111.589828] env[60548]: DEBUG oslo_vmware.api [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Task: {'id': task-4323421, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1112.092104] env[60548]: DEBUG oslo_vmware.exceptions [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Fault InvalidArgument not matched. 
{{(pid=60548) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1112.092332] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Releasing lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1112.092893] env[60548]: ERROR nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1112.092893] env[60548]: Faults: ['InvalidArgument'] [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Traceback (most recent call last): [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] yield resources [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] self.driver.spawn(context, instance, image_meta, [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] self._fetch_image_if_missing(context, vi) [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] image_cache(vi, tmp_image_ds_loc) [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] vm_util.copy_virtual_disk( [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] session._wait_for_task(vmdk_copy_task) [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] return self.wait_for_task(task_ref) [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] return evt.wait() [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] result = hub.switch() [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] return self.greenlet.switch() [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] self.f(*self.args, **self.kw) [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] raise exceptions.translate_fault(task_info.error) [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Faults: ['InvalidArgument'] [ 1112.092893] env[60548]: ERROR nova.compute.manager [instance: ad98988d-92aa-4ace-8e40-cd316758002e] [ 1112.093895] env[60548]: INFO nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Terminating instance [ 1112.094789] env[60548]: DEBUG oslo_concurrency.lockutils [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Acquired lock "[datastore1] devstack-image-cache_base/5674e50f-0c0c-4f19-8379-104dac34660b/5674e50f-0c0c-4f19-8379-104dac34660b.vmdk" {{(pid=60548) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1112.095132] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1112.095257] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-631faa44-080f-42f8-b6b8-5e90ad276309 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1112.097450] 
env[60548]: DEBUG nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Start destroying the instance on the hypervisor. {{(pid=60548) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1112.097641] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Destroying instance {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1112.098360] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dd09b65-e612-44e8-9cd8-3077391654d4 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1112.105485] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Unregistering the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1112.105684] env[60548]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6027b1f7-583a-4e11-b4e2-d1b9f09e7758 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1112.108015] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1112.108194] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=60548) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1112.109292] env[60548]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7106d7a5-0cc3-40cd-82d6-126066ee7525 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1112.114222] env[60548]: DEBUG oslo_vmware.api [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Waiting for the task: (returnval){ [ 1112.114222] env[60548]: value = "session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]529fac4c-e1ec-cb0b-fd28-8c8325e12496" [ 1112.114222] env[60548]: _type = "Task" [ 1112.114222] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1112.123284] env[60548]: DEBUG oslo_vmware.api [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Task: {'id': session[524fc3e2-4ac6-ca6c-a20e-cce293b94246]529fac4c-e1ec-cb0b-fd28-8c8325e12496, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1112.172013] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1112.179339] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Unregistered the VM {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1112.179684] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Deleting contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1112.179901] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Deleting the datastore file [datastore1] ad98988d-92aa-4ace-8e40-cd316758002e {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1112.180247] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4f223ba7-f5d1-46b9-a2d0-e25cbcc1cd08 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1112.187770] env[60548]: DEBUG oslo_vmware.api [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Waiting for the task: (returnval){
[ 1112.187770] env[60548]: value = "task-4323423"
[ 1112.187770] env[60548]: _type = "Task"
[ 1112.187770] env[60548]: } to complete. {{(pid=60548) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1112.196414] env[60548]: DEBUG oslo_vmware.api [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Task: {'id': task-4323423, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1112.624801] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Preparing fetch location {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1112.625086] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Creating directory with path [datastore1] vmware_temp/12cdc2f0-2207-4bff-91a8-54532814590f/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1112.625321] env[60548]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d4b8df03-37f8-4749-853f-fb45a54016b2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1112.637788] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Created directory with path [datastore1] vmware_temp/12cdc2f0-2207-4bff-91a8-54532814590f/5674e50f-0c0c-4f19-8379-104dac34660b {{(pid=60548) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1112.637987] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Fetch image to [datastore1] vmware_temp/12cdc2f0-2207-4bff-91a8-54532814590f/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk {{(pid=60548) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1112.638154] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to [datastore1] vmware_temp/12cdc2f0-2207-4bff-91a8-54532814590f/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk on the data store datastore1 {{(pid=60548) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1112.638891] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e02bd3ab-a41e-484c-97c9-c23b62f9a80b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1112.645891] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-641e7e9f-c182-4b2b-8f37-b590dac8aa3b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1112.655576] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4b232ba-e23f-4b2a-b018-1d99d35abee5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1112.688100] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88e68573-fae6-4b03-8b74-43c0ab695094 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1112.699900] env[60548]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2f3e42b2-a552-43c7-b4e1-76e665af670d {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1112.701599] env[60548]: DEBUG oslo_vmware.api [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Task: {'id': task-4323423, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072629} completed successfully. {{(pid=60548) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1112.701826] env[60548]: DEBUG nova.virt.vmwareapi.ds_util [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Deleted the datastore file {{(pid=60548) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1112.701996] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Deleted contents of the VM from datastore datastore1 {{(pid=60548) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1112.702173] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Instance destroyed {{(pid=60548) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1112.702374] env[60548]: INFO nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Took 0.60 seconds to destroy the instance on the hypervisor.
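The DeleteDatastoreFile_Task exchange above (invoke, then poll task-4323423 until it completes) is the standard oslo.vmware pattern: every vSphere *_Task method returns a Task moref immediately, and the caller blocks on it with wait_for_task. A minimal sketch of that pattern, assuming a reachable vCenter; the endpoint, credentials, datastore path, and datacenter reference below are placeholders, not values from this log:

    # Sketch of the invoke-then-wait pattern seen in the log above.
    from oslo_vmware import api as vmware_api

    # Placeholder endpoint and credentials.
    session = vmware_api.VMwareAPISession(
        'vc.example.test', 'admin', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    file_manager = session.vim.service_content.fileManager
    # Kick off the asynchronous delete; vCenter answers with a Task moref.
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore1] some-dir',  # placeholder datastore path
        datacenter=None)  # placeholder; a real vCenter call passes a Datacenter moref
    # wait_for_task polls the task (producing the "progress is N%" debug
    # lines) and raises if vCenter reports a fault.
    session.wait_for_task(task)

The "Deleted the datastore file" record only appears after wait_for_task returns, which is why the completion record carries a duration_secs value.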
[ 1112.704482] env[60548]: DEBUG nova.compute.claims [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Aborting claim: {{(pid=60548) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1112.704644] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1112.704993] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1112.724810] env[60548]: DEBUG nova.virt.vmwareapi.images [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Downloading image file data 5674e50f-0c0c-4f19-8379-104dac34660b to the data store datastore1 {{(pid=60548) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1112.731045] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1112.731692] env[60548]: DEBUG nova.compute.utils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Instance ad98988d-92aa-4ace-8e40-cd316758002e could not be found. {{(pid=60548) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1112.733257] env[60548]: DEBUG nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Instance disappeared during build. {{(pid=60548) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1112.733410] env[60548]: DEBUG nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Unplugging VIFs for instance {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1112.733564] env[60548]: DEBUG nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60548) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1112.733729] env[60548]: DEBUG nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Deallocating network for instance {{(pid=60548) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1112.733876] env[60548]: DEBUG nova.network.neutron [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] deallocate_for_instance() {{(pid=60548) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1112.758557] env[60548]: DEBUG nova.network.neutron [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Updating instance_info_cache with network_info: [] {{(pid=60548) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1112.767530] env[60548]: INFO nova.compute.manager [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Took 0.03 seconds to deallocate network for instance.
[ 1112.774960] env[60548]: DEBUG oslo_vmware.rw_handles [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/12cdc2f0-2207-4bff-91a8-54532814590f/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1112.834311] env[60548]: DEBUG oslo_vmware.rw_handles [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Completed reading data from the image iterator. {{(pid=60548) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 1112.834496] env[60548]: DEBUG oslo_vmware.rw_handles [None req-7667719b-d156-4a33-8877-e4d01d2f86dc tempest-MultipleCreateTestJSON-535747290 tempest-MultipleCreateTestJSON-535747290-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/12cdc2f0-2207-4bff-91a8-54532814590f/5674e50f-0c0c-4f19-8379-104dac34660b/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=60548) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1112.942278] env[60548]: DEBUG oslo_concurrency.lockutils [None req-c07ba2b1-1566-4f59-a77f-855deef06699 tempest-ServerDiskConfigTestJSON-1556678235 tempest-ServerDiskConfigTestJSON-1556678235-project-member] Lock "ad98988d-92aa-4ace-8e40-cd316758002e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 237.009s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1113.171160] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1113.171377] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1113.182347] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1113.182697] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1113.182697] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1113.182859] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60548) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1113.183936] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6140d932-33e6-456a-b523-09472847af2e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1113.192697] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fc6fbcb-91bf-4369-acb8-2b7a8f72c1c5 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1113.207116] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c340763-98a9-46da-baaa-8ee9392c4847 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1113.213949] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fc76ef4-191a-458d-90d4-b5c34ef30bfa {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1113.243599] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180668MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60548) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1113.243751] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1113.243941] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1113.274935] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1113.275129] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=100GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60548) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1113.290929] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Refreshing inventories for resource provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}}
[ 1113.304423] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Updating ProviderTree inventory for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}}
[ 1113.304627] env[60548]: DEBUG nova.compute.provider_tree [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Updating inventory in ProviderTree for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}}
[ 1113.314854] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Refreshing aggregate associations for resource provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64, aggregates: None {{(pid=60548) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}}
[ 1113.330911] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Refreshing trait associations for resource provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=60548) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}}
[ 1113.343053] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85ecd08a-c054-4c9e-b321-7d2c6766cf4e {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1113.350542] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82d8f574-8f1f-4fc5-9528-72922a08ec0b {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1113.380283] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aca49d92-d94a-4af1-91ae-c0f60d0141b7 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1113.388323] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14111cb5-07da-45c8-8253-b0fb234a2709 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1113.402024] env[60548]: DEBUG nova.compute.provider_tree [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed in ProviderTree for provider: 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 {{(pid=60548) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1113.410913] env[60548]: DEBUG nova.scheduler.client.report [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Inventory has not changed for provider 3c0a58fa-f44f-43ae-bee7-c3032edaaa64 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60548) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1113.426091] env[60548]: DEBUG nova.compute.resource_tracker [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60548) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1113.426251] env[60548]: DEBUG oslo_concurrency.lockutils [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.182s {{(pid=60548) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1113.426447] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1113.426585] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Cleaning up deleted instances with incomplete migration {{(pid=60548) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}}
[ 1114.433033] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1114.433486] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1114.433486] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60548) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}}
[ 1116.172030] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1116.172459] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Starting heal instance info cache {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}}
[ 1116.172459] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Rebuilding the list of instances to heal {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}}
[ 1116.182962] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Didn't find any instances for network info cache update. {{(pid=60548) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}}
[ 1117.172047] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1117.172453] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Cleaning up deleted instances {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1117.224422] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] There are 18 instances to clean {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}}
[ 1117.224610] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 3a4668ee-e420-4ad8-b638-95b3d55e00c1] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.260714] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: d0d515a4-15ce-4276-b151-34a8a556a1df] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.285837] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 3d0fbb3e-8b84-4ba9-bb7a-e679b6e2ed04] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.307814] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: e6466fbb-a225-4bbd-839b-f8c4b24d9860] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.328078] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 979c5fe5-051f-4a43-be2f-571aad25a4ae] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.350180] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 8f447658-c66d-4d94-af30-fd43c83dae0e] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.370730] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: e3fd811a-186d-436f-bdef-a910a3ccd416] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.391021] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: ad98988d-92aa-4ace-8e40-cd316758002e] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.411383] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 30cf201d-7a1c-479c-9040-fba38726d9ab] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.431458] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 6774e2f5-99d0-4dc9-9ac0-188b35bd68a2] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.452425] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 67cbaf5c-e743-4e07-8f74-c51e4f57914d] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.473469] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 46737200-2da8-41ee-b33e-3bb6cc3e4618] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.496577] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 306f3cb9-3028-4ff2-8090-2c9c1c72efc1] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.517055] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: be11788c-634f-40c0-8c8c-d6253d0e68ad] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.537234] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 2f252e1d-bd99-4d70-a8ab-b3f9ce8ab9b5] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.557884] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: afb2cdc1-74ec-4d08-85cb-e96b4071f661] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.578223] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 83ecd8bb-ba2b-4151-986b-26f50b54e8e2] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1117.602273] env[60548]: DEBUG nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] [instance: 2751bdfb-2f28-48e0-98c2-f232ed6da6df] Instance has had 0 of 5 cleanup attempts {{(pid=60548) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}}
[ 1118.619274] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1119.167466] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1119.179450] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1132.563851] env[60548]: DEBUG oslo_service.periodic_task [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60548) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
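Everything from the _poll_rebooting_instances record onward is the compute manager's periodic-task loop firing on its configured spacings. Such tasks are plain methods registered with oslo.service's periodic_task decorator; a minimal, self-contained sketch of that mechanism (the class and task names here are illustrative, not Nova's real manager):

    # Illustrative skeleton of the "Running periodic task ..." machinery.
    import time

    from oslo_config import cfg
    from oslo_service import periodic_task

    class DemoManager(periodic_task.PeriodicTasks):
        def __init__(self, conf):
            super().__init__(conf)

        @periodic_task.periodic_task(spacing=10)
        def _poll_something(self, context):
            # Nova's equivalents poll resizes, reboots, volume usage, etc.
            print('running periodic task _poll_something')

    manager = DemoManager(cfg.CONF)
    while True:
        # Fires every task whose spacing has elapsed and returns the
        # number of seconds until the next one is due.
        idle = manager.run_periodic_tasks(context=None)
        time.sleep(idle)

In the real service this loop is driven by oslo.service's thread group rather than a bare while loop.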
[ 1132.572599] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Getting list of instances from cluster (obj){
[ 1132.572599] env[60548]: value = "domain-c8"
[ 1132.572599] env[60548]: _type = "ClusterComputeResource"
[ 1132.572599] env[60548]: } {{(pid=60548) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}}
[ 1132.573976] env[60548]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca2d46e1-8a1b-418f-a6ee-ddb5641b36b2 {{(pid=60548) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1132.589302] env[60548]: DEBUG nova.virt.vmwareapi.vmops [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] Got total of 7 instances {{(pid=60548) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}}
[ 1132.589465] env[60548]: WARNING nova.compute.manager [None req-d89ff007-4e98-46f1-adaf-75f7446cf693 None None] While synchronizing instance power states, found 0 instances in the database and 7 instances on the hypervisor.
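The closing WARNING comes from the _sync_power_states periodic task, which compares how many instances the Nova database assigns to this host against what the driver's list_instances call just returned: here 0 rows in the database versus 7 VMs in cluster domain-c8, presumably VMs owned by other services sharing the cluster. A condensed paraphrase of that check; the two callables are hypothetical stand-ins for the real DB and driver calls:

    # Condensed paraphrase of the count check behind the final WARNING.
    def sync_power_states(get_db_instances, list_hypervisor_instances, warn):
        db_instances = get_db_instances()          # Nova's DB view of this host
        num_db = len(db_instances)
        num_vm = len(list_hypervisor_instances())  # driver view of the cluster
        if num_db != num_vm:
            warn('While synchronizing instance power states, found %d '
                 'instances in the database and %d instances on the '
                 'hypervisor.' % (num_db, num_vm))
        # ...the real task then reconciles the power state of each DB
        # instance individually; the mismatch itself is only a warning.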