[ 512.992819] env[59975]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 513.628489] env[60024]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 515.197211] env[60024]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=60024) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 515.197652] env[60024]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=60024) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 515.197652] env[60024]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=60024) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 515.197995] env[60024]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 515.199139] env[60024]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 515.320096] env[60024]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=60024) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 515.331205] env[60024]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s {{(pid=60024) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 515.567507] env[60024]: INFO nova.virt.driver [None req-c5354712-db8d-46f2-b61c-d02679c3988f None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 515.644373] env[60024]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 515.644606] env[60024]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 515.644606] env[60024]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=60024) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 518.904025] env[60024]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-fb647e47-1e58-4def-a511-f643338c9df9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 518.920080] env[60024]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=60024) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 518.920080] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-c98e223e-4f7c-43ff-b595-4b58d2c92196 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 518.953636] env[60024]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 83331.
[ 518.953837] env[60024]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.309s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 518.954584] env[60024]: INFO nova.virt.vmwareapi.driver [None req-c5354712-db8d-46f2-b61c-d02679c3988f None None] VMware vCenter version: 7.0.3
[ 518.957986] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b652a4b3-2c90-4a58-bf12-24399abd58f0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 518.975970] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43d1148d-f31f-44e6-85c8-f9dca4677254 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 518.982459] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98835ba4-ed78-4287-9986-1289f435cf79 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 518.989460] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7383885-f477-4ca1-a456-251e11f5053a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 519.004165] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e4550bf-b1bd-4cd1-916a-9150d1de2120 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 519.011201] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c24ea364-5df0-4859-a11c-ed1f588a4552 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 519.041277] env[60024]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-d80248f7-c9ea-49f8-823f-966aa5e8fcf1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 519.047458] env[60024]: DEBUG nova.virt.vmwareapi.driver [None req-c5354712-db8d-46f2-b61c-d02679c3988f None None] Extension org.openstack.compute already exists. {{(pid=60024) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 519.050257] env[60024]: INFO nova.compute.provider_config [None req-c5354712-db8d-46f2-b61c-d02679c3988f None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 519.067350] env[60024]: DEBUG nova.context [None req-c5354712-db8d-46f2-b61c-d02679c3988f None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),74c00a1b-a396-4090-b2c1-a2ad7f5ace70(cell1) {{(pid=60024) load_cells /opt/stack/nova/nova/context.py:464}}
[ 519.069367] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 519.069589] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 519.070381] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 519.070742] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Acquiring lock "74c00a1b-a396-4090-b2c1-a2ad7f5ace70" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 519.070938] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Lock "74c00a1b-a396-4090-b2c1-a2ad7f5ace70" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 519.071944] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Lock "74c00a1b-a396-4090-b2c1-a2ad7f5ace70" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 519.084792] env[60024]: DEBUG oslo_db.sqlalchemy.engines [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60024) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 519.085451] env[60024]: DEBUG oslo_db.sqlalchemy.engines [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=60024) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}}
[ 519.092507] env[60024]: ERROR nova.db.main.api [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 519.092507] env[60024]: result = function(*args, **kwargs)
[ 519.092507] env[60024]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 519.092507] env[60024]: return func(*args, **kwargs)
[ 519.092507] env[60024]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 519.092507] env[60024]: result = fn(*args, **kwargs)
[ 519.092507] env[60024]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 519.092507] env[60024]: return f(*args, **kwargs)
[ 519.092507] env[60024]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 519.092507] env[60024]: return db.service_get_minimum_version(context, binaries)
[ 519.092507] env[60024]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 519.092507] env[60024]: _check_db_access()
[ 519.092507] env[60024]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 519.092507] env[60024]: stacktrace = ''.join(traceback.format_stack())
[ 519.092507] env[60024]:
[ 519.093433] env[60024]: ERROR nova.db.main.api [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 519.093433] env[60024]: result = function(*args, **kwargs)
[ 519.093433] env[60024]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 519.093433] env[60024]: return func(*args, **kwargs)
[ 519.093433] env[60024]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 519.093433] env[60024]: result = fn(*args, **kwargs)
[ 519.093433] env[60024]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 519.093433] env[60024]: return f(*args, **kwargs)
[ 519.093433] env[60024]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version
[ 519.093433] env[60024]: return db.service_get_minimum_version(context, binaries)
[ 519.093433] env[60024]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 519.093433] env[60024]: _check_db_access()
[ 519.093433] env[60024]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 519.093433] env[60024]: stacktrace = ''.join(traceback.format_stack())
[ 519.093433] env[60024]:
[ 519.093796] env[60024]: WARNING nova.objects.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 519.093949] env[60024]: WARNING nova.objects.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Failed to get minimum service version for cell 74c00a1b-a396-4090-b2c1-a2ad7f5ace70
[ 519.094408] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Acquiring lock "singleton_lock" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 519.094575] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Acquired lock "singleton_lock" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 519.094828] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Releasing lock "singleton_lock" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 519.095190] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Full set of CONF: {{(pid=60024) _wait_for_exit_or_signal
/usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} [ 519.095338] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ******************************************************************************** {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 519.095469] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] Configuration options gathered from: {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 519.095603] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 519.095795] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 519.095923] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ================================================================================ {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 519.096146] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] allow_resize_to_same_host = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.096321] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] arq_binding_timeout = 300 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.096453] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] backdoor_port = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.096580] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] backdoor_socket = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.096745] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] block_device_allocate_retries = 60 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.096907] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] block_device_allocate_retries_interval = 3 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.097098] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cert = self.pem {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.097269] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.097436] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] 
compute_monitors = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.097602] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] config_dir = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.097771] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] config_drive_format = iso9660 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.097906] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.098084] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] config_source = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.098259] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] console_host = devstack {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.098426] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] control_exchange = nova {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.098585] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cpu_allocation_ratio = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.098745] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] daemon = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.098908] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] debug = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.099078] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] default_access_ip_network_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.099251] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] default_availability_zone = nova {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.099405] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] default_ephemeral_format = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.099642] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 
'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.099802] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] default_schedule_zone = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.099960] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] disk_allocation_ratio = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.100149] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] enable_new_services = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.100339] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] enabled_apis = ['osapi_compute'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.100506] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] enabled_ssl_apis = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.100670] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] flat_injected = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.100828] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] force_config_drive = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.100985] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] force_raw_images = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.101170] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] graceful_shutdown_timeout = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.101336] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] heal_instance_info_cache_interval = 60 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.101556] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] host = cpu-1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.101728] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.101891] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] initial_disk_allocation_ratio = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.102075] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] initial_ram_allocation_ratio = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.102301] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.102469] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] instance_build_timeout = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.102631] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] instance_delete_interval = 300 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.102798] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] instance_format = [instance: %(uuid)s] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.102966] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] instance_name_template = instance-%08x {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.103147] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] instance_usage_audit = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.103323] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] instance_usage_audit_period = month {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.103491] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.103657] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] instances_path = /opt/stack/data/nova/instances {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.103825] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] internal_service_availability_zone = internal {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.103983] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] key = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.104163] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] live_migration_retry_count = 30 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.104331] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] log_config_append = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.104501] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.104661] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] log_dir = None {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.104820] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] log_file = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.104951] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] log_options = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.105130] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] log_rotate_interval = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.105305] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] log_rotate_interval_type = days {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.105475] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] log_rotation_type = none {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.105613] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.105744] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.105913] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.106091] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.106226] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.106393] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] long_rpc_timeout = 1800 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.106551] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] max_concurrent_builds = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.106710] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] max_concurrent_live_migrations = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.106870] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] max_concurrent_snapshots = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.107038] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] max_local_block_devices = 3 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.107202] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] max_logfile_count = 30 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.107363] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] max_logfile_size_mb = 200 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.107521] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] maximum_instance_delete_attempts = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.107691] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] metadata_listen = 0.0.0.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.107861] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] metadata_listen_port = 8775 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.108040] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] metadata_workers = 2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.108210] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] migrate_max_retries = -1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.108382] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] mkisofs_cmd = genisoimage {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.108588] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] my_block_storage_ip = 10.180.1.21 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.108723] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] my_ip = 10.180.1.21 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.108887] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] network_allocate_retries = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.109083] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.109262] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] osapi_compute_listen = 0.0.0.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.109430] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] osapi_compute_listen_port = 8774 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.109601] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] osapi_compute_unique_server_name_scope = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.109770] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] osapi_compute_workers = 2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.109934] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] password_length = 12 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.110118] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] periodic_enable = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.110281] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] periodic_fuzzy_delay = 60 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.110452] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] pointer_model = usbtablet {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.110620] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] preallocate_images = none {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.110782] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] publish_errors = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.110914] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] pybasedir = /opt/stack/nova {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.111094] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ram_allocation_ratio = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.111255] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] rate_limit_burst = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.111423] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] rate_limit_except_level = CRITICAL {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.111584] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] rate_limit_interval = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.111746] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] reboot_timeout = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.111907] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] 
reclaim_instance_interval = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.112077] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] record = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.112253] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] reimage_timeout_per_gb = 60 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.112421] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] report_interval = 120 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.112594] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] rescue_timeout = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.112756] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] reserved_host_cpus = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.112917] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] reserved_host_disk_mb = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.113088] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] reserved_host_memory_mb = 512 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.113257] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] reserved_huge_pages = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.113414] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] resize_confirm_window = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.113571] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] resize_fs_using_block_device = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.113729] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] resume_guests_state_on_host_boot = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.113900] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.114072] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] rpc_response_timeout = 60 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.114239] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] run_external_periodic_tasks = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.114411] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] running_deleted_instance_action = reap 
{{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.114573] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] running_deleted_instance_poll_interval = 1800 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.114730] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] running_deleted_instance_timeout = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.114890] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] scheduler_instance_sync_interval = 120 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.115036] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] service_down_time = 300 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.115213] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] servicegroup_driver = db {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.115379] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] shelved_offload_time = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.115540] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] shelved_poll_interval = 3600 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.115708] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] shutdown_timeout = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.115869] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] source_is_ipv6 = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.116041] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ssl_only = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.116297] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.116469] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] sync_power_state_interval = 600 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.116630] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] sync_power_state_pool_size = 1000 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.116800] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] syslog_log_facility = LOG_USER {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.116957] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] tempdir = None {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.117132] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] timeout_nbd = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.117308] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] transport_url = **** {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.117470] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] update_resources_interval = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.117653] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] use_cow_images = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.118221] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] use_eventlog = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.118221] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] use_journal = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.118221] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] use_json = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.118372] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] use_rootwrap_daemon = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.118401] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] use_stderr = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.118548] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] use_syslog = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.118710] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vcpu_pin_set = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.118878] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plugging_is_fatal = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.119056] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plugging_timeout = 300 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.119229] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] virt_mkfs = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.119406] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] volume_usage_poll_interval = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.119566] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] watch_log_file = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.119740] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] web = /usr/share/spice-html5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 519.119932] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_concurrency.disable_process_locking = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.120260] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.120449] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.120620] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.120794] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.121313] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.121313] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.121313] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.auth_strategy = keystone {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.121492] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.compute_link_prefix = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.121655] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.121833] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.dhcp_domain = novalocal {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.122008] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.enable_instance_password = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.122182] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.glance_link_prefix = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.122349] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.122524] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.122687] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.instance_list_per_project_cells = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.122848] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.list_records_by_skipping_down_cells = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.123028] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.local_metadata_per_cell = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.123190] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.max_limit = 1000 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.123361] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.metadata_cache_expiration = 15 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.123536] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.neutron_default_tenant_id = default {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.123705] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.use_forwarded_for = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.123870] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.use_neutron_default_nets = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.124048] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.124219] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.124392] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.124566] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.124739] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.vendordata_dynamic_targets = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.124905] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.vendordata_jsonfile_path = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.125099] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.125299] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.backend = dogpile.cache.memcached {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.125469] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.backend_argument = **** {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.125642] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.config_prefix = cache.oslo {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.125814] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.dead_timeout = 60.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.125980] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.debug_cache_backend = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.126160] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.enable_retry_client = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.126328] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.enable_socket_keepalive = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.126503] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.enabled = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.126669] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.expiration_time = 600 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.126831] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.hashclient_retry_attempts = 2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.126996] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.hashclient_retry_delay = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.127184] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] 
cache.memcache_dead_retry = 300 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.127356] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.memcache_password = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.127522] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.127687] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.127852] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.memcache_pool_maxsize = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.128024] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.128194] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.memcache_sasl_enabled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.128379] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.128547] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.memcache_socket_timeout = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.128718] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.memcache_username = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.128887] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.proxies = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.129063] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.retry_attempts = 2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.129238] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.retry_delay = 0.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.129412] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.socket_keepalive_count = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.129578] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.socket_keepalive_idle = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.129744] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.socket_keepalive_interval = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.129905] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.tls_allowed_ciphers = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.130098] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.tls_cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.130256] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.tls_certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.130424] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.tls_enabled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.130584] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cache.tls_keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.130757] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.auth_section = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.130935] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.auth_type = password {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.131117] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.131300] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.catalog_info = volumev3::publicURL {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.131463] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.131628] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.131793] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.cross_az_attach = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.131957] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.debug = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.132137] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.endpoint_template = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.132309] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 
None None] cinder.http_retries = 3 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.132475] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.insecure = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.132633] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.132805] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.os_region_name = RegionOne {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.132972] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.133149] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cinder.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.133327] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.133491] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute.cpu_dedicated_set = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.133653] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute.cpu_shared_set = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.133823] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute.image_type_exclude_list = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.133988] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.134169] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute.max_concurrent_disk_ops = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.134337] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute.max_disk_devices_to_attach = -1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.134505] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.134677] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
519.134853] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute.resource_provider_association_refresh = 300 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.135028] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute.shutdown_retry_interval = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.135222] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.135406] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] conductor.workers = 2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.135586] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] console.allowed_origins = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.135749] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] console.ssl_ciphers = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.135922] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] console.ssl_minimum_version = default {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.136121] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] consoleauth.token_ttl = 600 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.136297] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.136461] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.136625] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.136786] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.connect_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.136945] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.connect_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.137117] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.endpoint_override = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.137285] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.insecure = False {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.137445] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.137603] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.max_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.137760] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.min_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.137919] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.region_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.138086] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.service_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.138265] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.service_type = accelerator {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.138428] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.138588] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.status_code_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.138747] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.status_code_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.138905] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.139099] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.139269] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] cyborg.version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.139457] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.backend = sqlalchemy {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.139638] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.connection = **** {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.139811] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.connection_debug = 0 {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.139985] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.connection_parameters = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.140175] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.connection_recycle_time = 3600 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.140356] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.connection_trace = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.140519] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.db_inc_retry_interval = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.140683] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.db_max_retries = 20 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.140848] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.db_max_retry_interval = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.141018] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.db_retry_interval = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.141195] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.max_overflow = 50 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.141363] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.max_pool_size = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.141534] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.max_retries = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.141697] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.mysql_enable_ndb = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.141870] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.142040] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.mysql_wsrep_sync_wait = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.142210] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.pool_timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.142387] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.retry_interval = 10 
{{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.142545] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.slave_connection = **** {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.142709] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.sqlite_synchronous = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.142870] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] database.use_db_reconnect = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.143067] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.backend = sqlalchemy {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.144832] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.connection = **** {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.145056] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.connection_debug = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.145253] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.connection_parameters = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.145435] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.connection_recycle_time = 3600 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.145620] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.connection_trace = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.145792] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.db_inc_retry_interval = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.145967] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.db_max_retries = 20 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.146157] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.db_max_retry_interval = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.146335] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.db_retry_interval = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.146513] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.max_overflow = 50 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.146685] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.max_pool_size = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.146860] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.max_retries = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.147041] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.mysql_enable_ndb = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.147225] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.147397] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.147568] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.pool_timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.147745] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.retry_interval = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.147912] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.slave_connection = **** {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.148098] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] api_database.sqlite_synchronous = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.148295] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] devices.enabled_mdev_types = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.148476] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.148646] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ephemeral_storage_encryption.enabled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.148817] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.148990] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.api_servers = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.149175] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.cafile = None {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.149350] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.149523] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.149688] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.connect_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.149855] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.connect_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.150041] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.debug = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.150293] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.default_trusted_certificate_ids = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.150397] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.enable_certificate_validation = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.150568] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.enable_rbd_download = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.150732] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.endpoint_override = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.150906] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.insecure = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.151092] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.151268] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.max_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.151449] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.min_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.151622] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.num_retries = 3 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.151801] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.rbd_ceph_conf = {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.151972] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.rbd_connect_timeout = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.152163] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.rbd_pool = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.152343] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.rbd_user = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.152509] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.region_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.152673] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.service_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.152849] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.service_type = image {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.153027] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.153200] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.status_code_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.153370] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.status_code_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.153535] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.153726] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.153897] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.verify_glance_signatures = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.154109] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] glance.version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.154259] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] guestfs.debug = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.154436] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.config_drive_cdrom = False {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.154605] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.config_drive_inject_password = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.154775] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.154944] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.enable_instance_metrics_collection = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.155123] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.enable_remotefx = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.155305] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.instances_path_share = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.155477] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.iscsi_initiator_list = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.155645] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.limit_cpu_features = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.155813] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.155981] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.156169] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.power_state_check_timeframe = 60 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.156339] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.156514] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.156682] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.use_multipath_io = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.156849] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.volume_attach_retry_count = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.157025] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.157194] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.vswitch_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.157364] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.157537] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] mks.enabled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.157900] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.158106] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] image_cache.manager_interval = 2400 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.158285] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] image_cache.precache_concurrency = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.158465] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] image_cache.remove_unused_base_images = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.158629] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.158799] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.158979] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] image_cache.subdirectory_name = _base {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.159173] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.api_max_retries = 60 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.159347] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.api_retry_interval = 2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.159513] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.auth_section = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.159679] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.auth_type = None {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.159843] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.160021] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.160188] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.160356] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.connect_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.160519] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.connect_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.160682] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.endpoint_override = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.160847] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.insecure = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.161017] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.161183] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.max_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.161348] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.min_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.161509] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.partition_key = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.161677] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.peer_list = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.161839] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.region_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.162011] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.serial_console_state_timeout = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.162183] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.service_name = None {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.162360] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.service_type = baremetal {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.162525] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.162686] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.status_code_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.162846] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.status_code_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.163015] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.163212] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.163379] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ironic.version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.163565] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.163744] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] key_manager.fixed_key = **** {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.163930] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.164124] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.barbican_api_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.164298] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.barbican_endpoint = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.164489] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.barbican_endpoint_type = public {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.164655] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.barbican_region_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.164817] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.164979] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.165159] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.166751] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.insecure = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.166751] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.166751] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.number_of_retries = 60 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.166751] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.retry_delay = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.166751] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.send_service_user_token = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.166751] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.166751] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.167107] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.verify_ssl = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.167107] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican.verify_ssl_path = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.167107] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican_service_user.auth_section = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.167107] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican_service_user.auth_type = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.167107] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican_service_user.cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.167261] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican_service_user.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.167429] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican_service_user.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.167596] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican_service_user.insecure = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.167758] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican_service_user.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.167923] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican_service_user.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.168096] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] barbican_service_user.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.168274] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.approle_role_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.168439] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.approle_secret_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.168602] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.168762] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.168928] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.169104] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.insecure = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.169272] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.169446] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.kv_mountpoint = secret {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.169611] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.kv_version = 2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.169774] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.namespace = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.169935] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.root_token_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.170129] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.170294] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.ssl_ca_crt_file = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.170457] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.170625] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.use_ssl = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.170800] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.170968] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.171147] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.171318] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.171482] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.connect_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.171646] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.connect_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.171807] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.endpoint_override = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.171971] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.insecure = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.172148] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.172317] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 
None None] keystone.max_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.172479] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.min_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.172638] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.region_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.172801] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.service_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.172976] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.service_type = identity {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.173157] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.173325] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.status_code_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.173490] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.status_code_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.173652] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.173841] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.174020] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] keystone.version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.174236] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.connection_uri = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.174407] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.cpu_mode = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.174582] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.cpu_model_extra_flags = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.174756] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.cpu_models = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.174934] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None 
None] libvirt.cpu_power_governor_high = performance {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.175125] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.cpu_power_governor_low = powersave {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.175302] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.cpu_power_management = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.175481] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.175651] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.device_detach_attempts = 8 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.175821] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.device_detach_timeout = 20 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.175991] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.disk_cachemodes = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.176171] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.disk_prefix = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.176346] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.enabled_perf_events = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.176517] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.file_backed_memory = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.176689] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.gid_maps = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.176853] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.hw_disk_discard = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.177026] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.hw_machine_type = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.177211] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.images_rbd_ceph_conf = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.177386] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.177562] env[60024]: DEBUG 
oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.177739] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.images_rbd_glance_store_name = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.177915] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.images_rbd_pool = rbd {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.178104] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.images_type = default {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.178276] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.images_volume_group = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.178445] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.inject_key = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.178611] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.inject_partition = -2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.178777] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.inject_password = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.178938] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.iscsi_iface = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.179114] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.iser_use_multipath = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.179286] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_bandwidth = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.179451] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.179616] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_downtime = 500 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.179780] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.179944] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.180128] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_inbound_addr = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.180300] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.180464] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_permit_post_copy = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.180633] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_scheme = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.180816] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_timeout_action = abort {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.180986] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_tunnelled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.181167] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_uri = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.181335] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.live_migration_with_native_tls = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.181498] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.max_queues = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.181666] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.181828] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.nfs_mount_options = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.182177] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.182358] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.182529] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.num_iser_scan_tries = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.182695] env[60024]: DEBUG 
oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.num_memory_encrypted_guests = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.182865] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.183040] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.num_pcie_ports = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.183213] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.num_volume_scan_tries = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.183388] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.pmem_namespaces = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.183548] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.quobyte_client_cfg = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.183838] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.184024] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.rbd_connect_timeout = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.184201] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.184371] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.184534] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.rbd_secret_uuid = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.184696] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.rbd_user = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.184863] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.185052] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.remote_filesystem_transport = ssh {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.185222] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.rescue_image_id = None {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.185804] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.rescue_kernel_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.185804] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.rescue_ramdisk_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.185804] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.185936] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.rx_queue_size = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.186049] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.smbfs_mount_options = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.186338] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.186516] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.snapshot_compression = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.186679] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.snapshot_image_format = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.186903] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.187085] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.sparse_logical_volumes = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.187261] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.swtpm_enabled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.187433] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.swtpm_group = tss {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.187604] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.swtpm_user = tss {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.187774] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.sysinfo_serial = unique {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.187935] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.tx_queue_size = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.188121] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.uid_maps = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.188287] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.use_virtio_for_bridges = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.188463] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.virt_type = kvm {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.188634] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.volume_clear = zero {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.188800] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.volume_clear_size = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.188967] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.volume_use_multipath = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.189142] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.vzstorage_cache_path = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.189318] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.189490] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.vzstorage_mount_group = qemu {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.189660] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.vzstorage_mount_opts = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.189832] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.190130] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.190317] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.vzstorage_mount_user = stack {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.190490] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.190671] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.auth_section = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.190848] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.auth_type = password {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.191026] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.191198] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.191368] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.191533] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.connect_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.191698] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.connect_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.191873] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.default_floating_pool = public {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.192048] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.endpoint_override = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.192222] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.extension_sync_interval = 600 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.192392] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.http_retries = 3 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.192558] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.insecure = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.192722] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.192885] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.max_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.193072] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.193241] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.min_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.193415] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.ovs_bridge = br-int {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.193584] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.physnets = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.193757] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.region_name = RegionOne {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.193928] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.service_metadata_proxy = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.194106] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.service_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.194287] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.service_type = network {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.194454] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.194615] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.status_code_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.194776] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.status_code_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.194939] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.195138] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.195310] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] neutron.version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.195487] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] notifications.bdms_in_notifications = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.195670] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] notifications.default_level = INFO 
{{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.195850] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] notifications.notification_format = unversioned {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.196026] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] notifications.notify_on_state_change = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.196214] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.196400] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] pci.alias = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.196577] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] pci.device_spec = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.196745] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] pci.report_in_placement = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.196923] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.auth_section = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.197114] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.auth_type = password {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.197291] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.197456] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.197617] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.197780] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.197944] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.connect_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.198122] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.connect_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.198289] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.default_domain_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.198453] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.default_domain_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.198615] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.domain_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.198777] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.domain_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.198938] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.endpoint_override = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.199117] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.insecure = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.199283] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.199445] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.max_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.199605] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.min_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.199775] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.password = **** {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.199938] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.project_domain_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.200126] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.project_domain_name = Default {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.200296] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.project_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.200471] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.project_name = service {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.200644] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.region_name = RegionOne {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.200806] env[60024]: DEBUG oslo_service.service 
[None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.service_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.200978] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.service_type = placement {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.201162] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.201328] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.status_code_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.201491] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.status_code_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.201653] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.system_scope = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.201814] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.201974] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.trust_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.202149] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.user_domain_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.202325] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.user_domain_name = Default {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.202494] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.user_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.202670] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.username = placement {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.202856] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.203029] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] placement.version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.203217] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.cores = 20 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.203388] env[60024]: DEBUG 
oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.count_usage_from_placement = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.203565] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.203744] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.injected_file_content_bytes = 10240 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.203915] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.injected_file_path_length = 255 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.204095] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.injected_files = 5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.204273] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.instances = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.204443] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.key_pairs = 100 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.204612] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.metadata_items = 128 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.204778] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.ram = 51200 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.204946] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.recheck_quota = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.205130] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.server_group_members = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.205309] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] quota.server_groups = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.205478] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] rdp.enabled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.205802] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.205996] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.206185] 
env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.206356] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] scheduler.image_metadata_prefilter = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.206524] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.206691] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] scheduler.max_attempts = 3 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.206859] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] scheduler.max_placement_results = 1000 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.207037] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.207207] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] scheduler.query_placement_for_availability_zone = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.207377] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] scheduler.query_placement_for_image_type_support = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.207539] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.207714] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] scheduler.workers = 2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.207890] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.208081] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.208269] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.208442] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.208616] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.208786] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.208953] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.209163] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.209343] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.host_subset_size = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.209507] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.209674] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.209842] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.isolated_hosts = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.210016] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.isolated_images = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.210192] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.210360] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.210525] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.pci_in_placement = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.210690] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.210857] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.211030] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.211204] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.211374] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.211547] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.211714] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.track_instance_changes = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.211897] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.212083] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] metrics.required = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.212258] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] metrics.weight_multiplier = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.212427] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.212595] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] metrics.weight_setting = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.212906] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.213098] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] serial_console.enabled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.213288] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] serial_console.port_range = 10000:20000 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.213467] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.213642] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.213814] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] serial_console.serialproxy_port = 6083 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.213986] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] service_user.auth_section = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.214179] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] service_user.auth_type = password {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.214347] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] service_user.cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.214509] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] service_user.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.214675] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] service_user.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.214840] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] service_user.insecure = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.215012] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] service_user.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.215188] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] service_user.send_service_user_token = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.215359] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] service_user.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.215521] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None 
None] service_user.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.215695] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] spice.agent_enabled = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.215874] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] spice.enabled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.216199] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.216406] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.216587] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] spice.html5proxy_port = 6082 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.216753] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] spice.image_compression = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.216917] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] spice.jpeg_compression = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.217093] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] spice.playback_compression = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.217277] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] spice.server_listen = 127.0.0.1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.217454] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.217619] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] spice.streaming_mode = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.217783] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] spice.zlib_compression = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.217950] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] upgrade_levels.baseapi = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.218130] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] upgrade_levels.cert = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.218311] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] upgrade_levels.compute = auto {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.218476] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] upgrade_levels.conductor = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.218639] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] upgrade_levels.scheduler = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.218809] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vendordata_dynamic_auth.auth_section = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.218975] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vendordata_dynamic_auth.auth_type = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.219150] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vendordata_dynamic_auth.cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.219316] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vendordata_dynamic_auth.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.219480] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.219643] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vendordata_dynamic_auth.insecure = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.219805] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vendordata_dynamic_auth.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.219969] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.220154] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vendordata_dynamic_auth.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.220330] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.api_retry_count = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.220496] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.ca_file = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.220672] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.cache_prefix = devstack-image-cache {{(pid=60024) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.220844] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.cluster_name = testcl1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.221018] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.connection_pool_size = 10 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.221191] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.console_delay_seconds = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.221366] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.datastore_regex = ^datastore.* {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.221588] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.221764] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.host_password = **** {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.221935] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.host_port = 443 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.222122] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.host_username = administrator@vsphere.local {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.222302] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.insecure = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.222468] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.integration_bridge = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.222638] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.maximum_objects = 100 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.222802] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.pbm_default_policy = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.222968] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.pbm_enabled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.223145] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.pbm_wsdl_location = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.223358] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.223540] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.serial_port_proxy_uri = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.223707] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.serial_port_service_uri = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.223878] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.task_poll_interval = 0.5 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.224068] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.use_linked_clone = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.224251] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.vnc_keymap = en-us {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.224421] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.vnc_port = 5900 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.224587] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vmware.vnc_port_total = 10000 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.224780] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vnc.auth_schemes = ['none'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.224962] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vnc.enabled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.225287] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.225483] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.225662] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vnc.novncproxy_port = 6080 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.225846] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vnc.server_listen = 127.0.0.1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.226037] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.226214] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 
None None] vnc.vencrypt_ca_certs = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.226379] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vnc.vencrypt_client_cert = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.226542] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vnc.vencrypt_client_key = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.226725] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.226895] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.disable_deep_image_inspection = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.227074] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.227246] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.227410] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.227573] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.disable_rootwrap = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.227743] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.enable_numa_live_migration = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.227899] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.228074] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.228246] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.228409] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.libvirt_disable_apic = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.228573] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] 
workarounds.never_download_image_if_on_rbd = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.228737] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.228900] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.229076] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.229245] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.229409] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.229571] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.229733] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.229894] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.230112] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.230281] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.230460] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] wsgi.client_socket_timeout = 900 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.230631] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] wsgi.default_pool_size = 1000 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.230801] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] wsgi.keep_alive = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.230972] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] 
wsgi.max_header_line = 16384 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.231152] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] wsgi.secure_proxy_ssl_header = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.231323] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] wsgi.ssl_ca_file = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.231489] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] wsgi.ssl_cert_file = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.231653] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] wsgi.ssl_key_file = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.231819] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] wsgi.tcp_keepidle = 600 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.231998] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.232183] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] zvm.ca_file = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.232356] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] zvm.cloud_connector_url = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.232658] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.232834] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] zvm.reachable_timeout = 300 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.233031] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_policy.enforce_new_defaults = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.233212] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_policy.enforce_scope = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.233397] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_policy.policy_default_rule = default {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.233583] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
[ 519.233765] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_policy.policy_file = policy.yaml {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.233945] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.234126] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.234294] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.234457] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.234623] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.234796] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.234976] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.235170] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] profiler.connection_string = messaging:// {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.235346] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] profiler.enabled = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.235521] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] profiler.es_doc_type = notification {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.235689] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] profiler.es_scroll_size = 10000 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.235861] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] profiler.es_scroll_time = 2m {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.236037] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] profiler.filter_error_trace = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.236224] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] profiler.hmac_keys = SECRET_KEY {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.236391] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] profiler.sentinel_service_name = mymaster {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.236566] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] profiler.socket_timeout = 0.1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.236733] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] profiler.trace_sqlalchemy = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.236903] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] remote_debug.host = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.237079] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] remote_debug.port = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.237269] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.237438] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.237606] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.237772] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.237938] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.238117] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.238287] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.238455] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.238621] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.238787] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.238962] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.239142] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.239319] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.239487] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.239650] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.239828] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.239994] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.240172] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.240342] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.240510] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.240673] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.240841] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.241023] env[60024]: DEBUG 
oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.241188] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.241357] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.241525] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.ssl = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.241699] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.241872] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.242046] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.242225] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.242403] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_rabbit.ssl_version = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.242590] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.242759] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_notifications.retry = -1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.242943] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.243130] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_messaging_notifications.transport_url = **** {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.243306] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.auth_section = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.243467] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.auth_type = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.243627] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.cafile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.243784] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.certfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.243948] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.collect_timing = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.244120] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.connect_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.244287] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.connect_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.244445] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.endpoint_id = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.244605] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.endpoint_override = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.244768] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.insecure = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.244925] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.keyfile = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.245096] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.max_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.245259] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.min_version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.245415] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.region_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.245571] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.service_name = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.245728] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.service_type = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.245890] env[60024]: DEBUG oslo_service.service [None 
req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.split_loggers = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.246060] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.status_code_retries = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.246226] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.status_code_retry_delay = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.246389] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.timeout = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.246547] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.valid_interfaces = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.246704] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_limit.version = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.246867] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_reports.file_event_handler = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.247041] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.247205] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] oslo_reports.log_dir = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.247379] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.247539] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.247699] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.247865] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.248042] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.248209] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] 
vif_plug_linux_bridge_privileged.user = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.248381] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.248540] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plug_ovs_privileged.group = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.248698] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.248865] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.249040] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.249204] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] vif_plug_ovs_privileged.user = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.249379] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_linux_bridge.flat_interface = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.249561] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.249736] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.249912] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.250099] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.250273] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.250442] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.250605] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_linux_bridge.vlan_interface = None 
{{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.250784] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_ovs.isolate_vif = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.250954] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.251139] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.251324] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.251515] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_ovs.ovsdb_interface = native {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.251683] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_vif_ovs.per_port_bridge = False {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.251855] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_brick.lock_path = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.252036] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.252210] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.252389] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] privsep_osbrick.capabilities = [21] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.252551] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] privsep_osbrick.group = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.252711] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] privsep_osbrick.helper_command = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.252879] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.253058] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.253224] env[60024]: DEBUG oslo_service.service 
[None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] privsep_osbrick.user = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.253402] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.253563] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] nova_sys_admin.group = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.253722] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] nova_sys_admin.helper_command = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.253890] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.254065] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.254231] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] nova_sys_admin.user = None {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 519.254362] env[60024]: DEBUG oslo_service.service [None req-f1728188-8f06-42e2-8cd9-47bc5734cf40 None None] ******************************************************************************** {{(pid=60024) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 519.254793] env[60024]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 519.263506] env[60024]: INFO nova.virt.node [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Generated node identity 5b70561f-4086-4d22-a0b6-aa1035435329 [ 519.263759] env[60024]: INFO nova.virt.node [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Wrote node identity 5b70561f-4086-4d22-a0b6-aa1035435329 to /opt/stack/data/n-cpu-1/compute_id [ 519.275651] env[60024]: WARNING nova.compute.manager [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Compute nodes ['5b70561f-4086-4d22-a0b6-aa1035435329'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 519.305982] env[60024]: INFO nova.compute.manager [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 519.329390] env[60024]: WARNING nova.compute.manager [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
[ 519.329659] env[60024]: DEBUG oslo_concurrency.lockutils [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 519.329978] env[60024]: DEBUG oslo_concurrency.lockutils [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 519.329978] env[60024]: DEBUG oslo_concurrency.lockutils [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 519.330202] env[60024]: DEBUG nova.compute.resource_tracker [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60024) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 519.331330] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-650dc873-cee2-4df3-97a0-0548e1f57d75 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.340487] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8df34d3b-03c1-4714-aa96-77b5452033c3 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.355260] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca7c3afe-625d-49f6-96f5-dd056450f2ec {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.362034] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbd339b8-6221-4a64-af19-92a9f217a7e2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.390939] env[60024]: DEBUG nova.compute.resource_tracker [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180706MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60024) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 519.391134] env[60024]: DEBUG oslo_concurrency.lockutils [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 519.391300] env[60024]: DEBUG oslo_concurrency.lockutils [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 519.403628] env[60024]: WARNING nova.compute.resource_tracker [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] No compute node 
record for cpu-1:5b70561f-4086-4d22-a0b6-aa1035435329: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 5b70561f-4086-4d22-a0b6-aa1035435329 could not be found. [ 519.419265] env[60024]: INFO nova.compute.resource_tracker [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 5b70561f-4086-4d22-a0b6-aa1035435329 [ 519.469987] env[60024]: DEBUG nova.compute.resource_tracker [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 519.470299] env[60024]: DEBUG nova.compute.resource_tracker [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=100GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 519.576965] env[60024]: INFO nova.scheduler.client.report [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] [req-7faf1cbb-1924-4a4f-9234-60aa5995f06e] Created resource provider record via placement API for resource provider with UUID 5b70561f-4086-4d22-a0b6-aa1035435329 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 519.596075] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2b0e44a-5d69-48cc-94d8-5d99e4ea7e0c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.602227] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43941703-9ddf-413c-b859-ad4a6a96426e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.632439] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-611544c5-0dd7-4d2e-8cab-ffc13816350c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.641556] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-703c73bd-8092-478c-aded-ce368ca00fd9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 519.655549] env[60024]: DEBUG nova.compute.provider_tree [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Updating inventory in ProviderTree for provider 5b70561f-4086-4d22-a0b6-aa1035435329 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 519.695059] env[60024]: DEBUG nova.scheduler.client.report [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Updated inventory for provider 5b70561f-4086-4d22-a0b6-aa1035435329 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 519.695059] env[60024]: DEBUG nova.compute.provider_tree [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Updating resource provider 5b70561f-4086-4d22-a0b6-aa1035435329 generation from 0 to 1 during operation: update_inventory {{(pid=60024) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 519.695059] env[60024]: DEBUG nova.compute.provider_tree [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Updating inventory in ProviderTree for provider 5b70561f-4086-4d22-a0b6-aa1035435329 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 519.739095] env[60024]: DEBUG nova.compute.provider_tree [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Updating resource provider 5b70561f-4086-4d22-a0b6-aa1035435329 generation from 1 to 2 during operation: update_traits {{(pid=60024) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 519.754654] env[60024]: DEBUG nova.compute.resource_tracker [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60024) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 519.757010] env[60024]: DEBUG oslo_concurrency.lockutils [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.366s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 519.757386] env[60024]: DEBUG nova.service [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Creating RPC server for service compute {{(pid=60024) start /opt/stack/nova/nova/service.py:182}} [ 519.772176] env[60024]: DEBUG nova.service [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] Join ServiceGroup membership for this service compute {{(pid=60024) start /opt/stack/nova/nova/service.py:199}} [ 519.772176] env[60024]: DEBUG nova.servicegroup.drivers.db [None req-afe55959-f162-41bd-9bd4-e0a5011dc2d0 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=60024) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 543.774749] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 543.785917] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Getting list of instances from cluster (obj){ [ 543.785917] env[60024]: value = "domain-c8" [ 543.785917] env[60024]: _type = "ClusterComputeResource" [ 543.785917] env[60024]: } {{(pid=60024) list_instances 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 543.789033] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3626d4d-b745-4bd7-a01c-750dd9e8ef7f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 543.797813] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Got total of 0 instances {{(pid=60024) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 543.798256] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 543.798687] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Getting list of instances from cluster (obj){ [ 543.798687] env[60024]: value = "domain-c8" [ 543.798687] env[60024]: _type = "ClusterComputeResource" [ 543.798687] env[60024]: } {{(pid=60024) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 543.799791] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e00f7470-37e9-4c85-934b-66c5eaf69056 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 543.807912] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Got total of 0 instances {{(pid=60024) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 554.602920] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Acquiring lock "15e44d1f-ae9b-4ff7-841c-90acc81cf38b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 554.602920] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Lock "15e44d1f-ae9b-4ff7-841c-90acc81cf38b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 554.632596] env[60024]: DEBUG nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Starting instance... 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 554.746825] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 554.748045] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 554.754255] env[60024]: INFO nova.compute.claims [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 554.909247] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-695db624-aaec-48ba-83c1-76c4bad8ae93 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.925075] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6542974-f1bd-4f40-9821-5327b7ade3fe {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.967320] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cfa29c19-9901-4be4-8a83-76db5a3a476c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.976234] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49f5bc16-9a0e-4301-b3ab-c4c6cbff6cd2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 554.994098] env[60024]: DEBUG nova.compute.provider_tree [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 555.006631] env[60024]: DEBUG nova.scheduler.client.report [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 555.032055] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 
tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 555.032658] env[60024]: DEBUG nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 555.072301] env[60024]: DEBUG nova.compute.utils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 555.074103] env[60024]: DEBUG nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Not allocating networking since 'none' was specified. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 555.093830] env[60024]: DEBUG nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 555.193026] env[60024]: DEBUG nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 557.030775] env[60024]: DEBUG nova.virt.hardware [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 557.031369] env[60024]: DEBUG nova.virt.hardware [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 557.031602] env[60024]: DEBUG nova.virt.hardware [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 557.031875] env[60024]: DEBUG nova.virt.hardware [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 557.032019] env[60024]: DEBUG nova.virt.hardware [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 557.032184] env[60024]: DEBUG nova.virt.hardware [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 557.032409] env[60024]: DEBUG nova.virt.hardware [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 557.032937] env[60024]: DEBUG nova.virt.hardware [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 557.034517] 
env[60024]: DEBUG nova.virt.hardware [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 557.034966] env[60024]: DEBUG nova.virt.hardware [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 557.034966] env[60024]: DEBUG nova.virt.hardware [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 557.037064] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c80e7867-cbe7-4d2c-acae-9caac0b45baf {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.047296] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-699ab527-e914-4716-81c2-ef26377bee73 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.075948] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d57c4b4c-52cb-4551-99f8-2207f1b89b11 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.098767] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Instance VIF info [] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 557.107051] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 557.107429] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a8639ee8-eaa0-432f-a9b1-9f2bcd031240 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.121102] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Created folder: OpenStack in parent group-v4. [ 557.121317] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Creating folder: Project (b9d958b398104133b85a3f555684d617). Parent ref: group-v894073. 
{{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 557.121566] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c2fa8c87-5c5e-4466-ab55-2c46f6245924 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.132979] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Created folder: Project (b9d958b398104133b85a3f555684d617) in parent group-v894073. [ 557.134275] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Creating folder: Instances. Parent ref: group-v894074. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 557.134275] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-927a3996-a050-41ef-93e7-a3dde3e7b4eb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.146282] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Created folder: Instances in parent group-v894074. [ 557.146282] env[60024]: DEBUG oslo.service.loopingcall [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 557.146282] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 557.146282] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fb340e43-d266-4ae5-8b28-e2b1b1b93a59 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.163477] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 557.163477] env[60024]: value = "task-4576202" [ 557.163477] env[60024]: _type = "Task" [ 557.163477] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 557.173319] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576202, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 557.673497] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576202, 'name': CreateVM_Task, 'duration_secs': 0.28655} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 557.673776] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 557.674668] env[60024]: DEBUG oslo_vmware.service [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32ae2acb-08dc-438e-bf80-2e1e62af5b76 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.681569] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 557.681733] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 557.682408] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 557.682642] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a63ce16f-f244-4a70-91b4-f2167238490b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.688248] env[60024]: DEBUG oslo_vmware.api [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Waiting for the task: (returnval){ [ 557.688248] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]5269526a-ddd8-f958-d264-0e2a46269c32" [ 557.688248] env[60024]: _type = "Task" [ 557.688248] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 557.706387] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 557.706387] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 557.706387] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 557.706387] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 557.707278] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 557.707278] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6e98dbf4-1621-4e92-8657-938f80d38d0e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.727410] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 557.727673] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 557.728519] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02022880-4cbb-47e7-a375-a737149ae538 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.739142] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0ca21429-ed97-4ed8-8250-d534b1605727 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 557.743108] env[60024]: DEBUG oslo_vmware.api [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Waiting for the task: (returnval){ [ 557.743108] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]525ce27b-f93c-b904-628d-0cde0134227c" [ 557.743108] env[60024]: _type = "Task" [ 557.743108] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 557.753281] env[60024]: DEBUG oslo_vmware.api [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]525ce27b-f93c-b904-628d-0cde0134227c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 558.259393] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 558.259755] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Creating directory with path [datastore2] vmware_temp/4aa2014d-9089-4073-95bc-66da8f5f98e6/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 558.260054] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2f54817c-c9be-4519-a325-f8fabdc118b7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.284074] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Created directory with path [datastore2] vmware_temp/4aa2014d-9089-4073-95bc-66da8f5f98e6/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 558.284206] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Fetch image to [datastore2] vmware_temp/4aa2014d-9089-4073-95bc-66da8f5f98e6/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 558.284315] env[60024]: DEBUG 
nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/4aa2014d-9089-4073-95bc-66da8f5f98e6/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 558.285276] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ec24fb1-eb01-49cd-abbd-9a30e75762a9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.293380] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4d87784-619e-4f1c-9106-1fb7267ca2e4 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.304099] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-916298ee-c5e5-4b37-97ac-8becfa15b7a5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.336668] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c161cd00-7b65-4df8-aeb2-f154f02bbd8e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.345512] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a13c9f26-bb76-4fe2-8d2e-a43c185730ed {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 558.379593] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 558.554049] env[60024]: DEBUG oslo_vmware.rw_handles [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4aa2014d-9089-4073-95bc-66da8f5f98e6/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 558.617844] env[60024]: DEBUG oslo_vmware.rw_handles [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Completed reading data from the image iterator. 
{{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 558.617844] env[60024]: DEBUG oslo_vmware.rw_handles [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4aa2014d-9089-4073-95bc-66da8f5f98e6/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 561.244945] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Acquiring lock "8a0d9829-6759-4593-9230-459a546a5908" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 561.245546] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Lock "8a0d9829-6759-4593-9230-459a546a5908" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 561.267022] env[60024]: DEBUG nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 561.328245] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 561.328404] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 561.332764] env[60024]: INFO nova.compute.claims [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 561.454019] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32d37b37-baf6-4cde-b2ec-e9b197fcb531 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.462792] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a85d489d-3ada-4695-ab92-27264926daed {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.500171] env[60024]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fea2a83-6b6d-49f3-93ea-c8c0f8fdf6fa {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.508732] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e33f8dff-5f04-4043-bc86-a89dff339a1d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.524551] env[60024]: DEBUG nova.compute.provider_tree [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 561.537436] env[60024]: DEBUG nova.scheduler.client.report [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 561.564293] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 561.568645] env[60024]: DEBUG nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 561.606159] env[60024]: DEBUG nova.compute.utils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 561.606956] env[60024]: DEBUG nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Allocating IP information in the background. 
{{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 561.607248] env[60024]: DEBUG nova.network.neutron [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 561.624136] env[60024]: DEBUG nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 561.715626] env[60024]: DEBUG nova.policy [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5208b8d162bc46d489a34997aaebbaa2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4b3edaaa3fdc4f73b49b8e57e04b8fa0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 561.718840] env[60024]: DEBUG nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 561.757398] env[60024]: DEBUG nova.virt.hardware [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 561.757866] env[60024]: DEBUG nova.virt.hardware [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 561.758168] env[60024]: DEBUG nova.virt.hardware [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 561.758470] env[60024]: DEBUG nova.virt.hardware [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 561.758741] env[60024]: DEBUG nova.virt.hardware [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 561.758997] env[60024]: DEBUG nova.virt.hardware [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 561.759325] env[60024]: DEBUG nova.virt.hardware [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 561.760418] env[60024]: DEBUG nova.virt.hardware [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 561.760418] env[60024]: DEBUG nova.virt.hardware [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] 
Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 561.760418] env[60024]: DEBUG nova.virt.hardware [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 561.760418] env[60024]: DEBUG nova.virt.hardware [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 561.761432] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c8af2d3-220f-46c7-89f7-5370f7473376 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 561.775701] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0708b94f-e050-4cb8-9de1-d933bd88f82c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 563.094114] env[60024]: DEBUG nova.network.neutron [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Successfully created port: e709139c-f29a-48b4-88ea-ff5a533c2b9b {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 566.430319] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Acquiring lock "c4400e80-4457-4a8a-8588-f594e5993cde" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 566.430319] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Lock "c4400e80-4457-4a8a-8588-f594e5993cde" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 566.448573] env[60024]: DEBUG nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Starting instance... 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 566.531677] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 566.531959] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 566.536994] env[60024]: INFO nova.compute.claims [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 566.678012] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d11deca0-24aa-410d-9935-b38403494cfb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.689894] env[60024]: DEBUG nova.network.neutron [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Successfully updated port: e709139c-f29a-48b4-88ea-ff5a533c2b9b {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 566.691986] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7ebe972-a395-4435-b005-7636319b0bf3 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.703446] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Acquiring lock "refresh_cache-8a0d9829-6759-4593-9230-459a546a5908" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 566.703596] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Acquired lock "refresh_cache-8a0d9829-6759-4593-9230-459a546a5908" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 566.703750] env[60024]: DEBUG nova.network.neutron [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 566.744137] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a950b296-2038-4103-9401-9290be5614a0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.760098] env[60024]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97ab15dd-766f-4f85-b133-45394f6aaa10 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 566.774656] env[60024]: DEBUG nova.compute.provider_tree [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 566.786158] env[60024]: DEBUG nova.scheduler.client.report [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 566.810190] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.278s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 566.810605] env[60024]: DEBUG nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 566.827050] env[60024]: DEBUG nova.network.neutron [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 566.864222] env[60024]: DEBUG nova.compute.utils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 566.865766] env[60024]: DEBUG nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Allocating IP information in the background. 
{{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 566.866359] env[60024]: DEBUG nova.network.neutron [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 566.881872] env[60024]: DEBUG nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 566.983812] env[60024]: DEBUG nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Start spawning the instance on the hypervisor. {{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 567.010108] env[60024]: DEBUG nova.virt.hardware [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 567.010348] env[60024]: DEBUG nova.virt.hardware [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 567.010501] env[60024]: DEBUG nova.virt.hardware [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 567.010736] env[60024]: DEBUG nova.virt.hardware [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 567.010879] env[60024]: DEBUG nova.virt.hardware [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Image pref 0:0:0 {{(pid=60024) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 567.011030] env[60024]: DEBUG nova.virt.hardware [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 567.011240] env[60024]: DEBUG nova.virt.hardware [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 567.011390] env[60024]: DEBUG nova.virt.hardware [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 567.011710] env[60024]: DEBUG nova.virt.hardware [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 567.011949] env[60024]: DEBUG nova.virt.hardware [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 567.012155] env[60024]: DEBUG nova.virt.hardware [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 567.013045] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fb6adb3-08e6-4c68-b1a4-03cfc3668ae5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.022766] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a38ad44b-6dc1-4ab4-aa8b-3d8b2bbb969f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.483918] env[60024]: DEBUG nova.policy [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '670367f892a9462f982808a51b5d890b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8daf8ee8cb0542f099584ca77665f732', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize 
/opt/stack/nova/nova/policy.py:203}} [ 567.901579] env[60024]: DEBUG nova.network.neutron [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Updating instance_info_cache with network_info: [{"id": "e709139c-f29a-48b4-88ea-ff5a533c2b9b", "address": "fa:16:3e:8f:a7:25", "network": {"id": "ce212b44-d471-4ec9-acb2-9df369ad358e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-910957629-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4b3edaaa3fdc4f73b49b8e57e04b8fa0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "089a7624-43ba-4fce-bfc0-63e4bb7f9aeb", "external-id": "nsx-vlan-transportzone-218", "segmentation_id": 218, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape709139c-f2", "ovs_interfaceid": "e709139c-f29a-48b4-88ea-ff5a533c2b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 567.927786] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Releasing lock "refresh_cache-8a0d9829-6759-4593-9230-459a546a5908" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 567.927786] env[60024]: DEBUG nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Instance network_info: |[{"id": "e709139c-f29a-48b4-88ea-ff5a533c2b9b", "address": "fa:16:3e:8f:a7:25", "network": {"id": "ce212b44-d471-4ec9-acb2-9df369ad358e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-910957629-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4b3edaaa3fdc4f73b49b8e57e04b8fa0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "089a7624-43ba-4fce-bfc0-63e4bb7f9aeb", "external-id": "nsx-vlan-transportzone-218", "segmentation_id": 218, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape709139c-f2", "ovs_interfaceid": "e709139c-f29a-48b4-88ea-ff5a533c2b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 567.928116] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None 
req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8f:a7:25', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '089a7624-43ba-4fce-bfc0-63e4bb7f9aeb', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e709139c-f29a-48b4-88ea-ff5a533c2b9b', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 567.940758] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Creating folder: Project (4b3edaaa3fdc4f73b49b8e57e04b8fa0). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 567.940758] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-33e708f2-d9bb-48e9-b3b9-52f47aed5ce0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.952706] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Created folder: Project (4b3edaaa3fdc4f73b49b8e57e04b8fa0) in parent group-v894073. [ 567.952873] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Creating folder: Instances. Parent ref: group-v894077. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 567.953525] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-73cf1ecf-e115-4671-9598-a27c4ab5ef88 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.965619] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Created folder: Instances in parent group-v894077. [ 567.965619] env[60024]: DEBUG oslo.service.loopingcall [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 567.965619] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 567.965619] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-7a41cf53-002a-4d40-9b50-b633d44d51dd {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 567.990095] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 567.990095] env[60024]: value = "task-4576205" [ 567.990095] env[60024]: _type = "Task" [ 567.990095] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 568.003555] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576205, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 568.224509] env[60024]: DEBUG nova.network.neutron [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Successfully created port: 4f2cfc88-49e8-4fca-87cb-b8e60d729b53 {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 568.503814] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576205, 'name': CreateVM_Task, 'duration_secs': 0.350438} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 568.506242] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 568.566255] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 568.566848] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 568.568138] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 568.568321] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-93616e48-83a9-489a-84b1-ee115552459e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 568.576857] env[60024]: DEBUG oslo_vmware.api [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Waiting for the task: (returnval){ [ 568.576857] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]524b629d-d0b8-93cb-474e-4b499ff96243" [ 568.576857] env[60024]: _type = "Task" [ 568.576857] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 568.586096] env[60024]: DEBUG oslo_vmware.api [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]524b629d-d0b8-93cb-474e-4b499ff96243, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 568.900547] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Acquiring lock "68c87b51-b90a-47cc-bec1-05f7c389fc14" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 568.901033] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Lock "68c87b51-b90a-47cc-bec1-05f7c389fc14" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 568.917339] env[60024]: DEBUG nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 568.987544] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 568.988014] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 568.990150] env[60024]: INFO nova.compute.claims [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 569.088149] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 569.088672] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 569.088916] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Acquiring lock "[datastore2] 
devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 569.169101] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "7a4778b7-5ffc-4641-b968-d0304fd67ee0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 569.169397] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "7a4778b7-5ffc-4641-b968-d0304fd67ee0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 569.189186] env[60024]: DEBUG nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 569.194293] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76431761-4f27-4d68-b3b4-43cc52a65669 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.204684] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3b3dcc1-aff2-4ba5-842a-f9e727d3820b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.246625] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e7f8855-8cbb-4bc0-9963-3e780203ff17 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.257756] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-211359ce-14a2-4dc2-9764-cd3647f14221 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.265142] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 569.273612] env[60024]: DEBUG nova.compute.provider_tree [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 569.291331] env[60024]: DEBUG nova.scheduler.client.report [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Inventory has not changed for 
provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 569.310082] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 569.310319] env[60024]: DEBUG nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 569.318614] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.051s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 569.318614] env[60024]: INFO nova.compute.claims [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 569.353752] env[60024]: DEBUG nova.compute.utils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 569.356691] env[60024]: DEBUG nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Not allocating networking since 'none' was specified. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 569.367057] env[60024]: DEBUG nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Start building block device mappings for instance. 
{{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 569.384757] env[60024]: DEBUG nova.network.neutron [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Successfully updated port: 4f2cfc88-49e8-4fca-87cb-b8e60d729b53 {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 569.412029] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Acquiring lock "refresh_cache-c4400e80-4457-4a8a-8588-f594e5993cde" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 569.412029] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Acquired lock "refresh_cache-c4400e80-4457-4a8a-8588-f594e5993cde" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 569.412029] env[60024]: DEBUG nova.network.neutron [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 569.451388] env[60024]: DEBUG nova.compute.manager [req-ff10ab54-6fd8-4921-94c2-a71cba79ed04 req-5eacd8ec-a079-46db-8b43-06663078e552 service nova] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Received event network-vif-plugged-e709139c-f29a-48b4-88ea-ff5a533c2b9b {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 569.451649] env[60024]: DEBUG oslo_concurrency.lockutils [req-ff10ab54-6fd8-4921-94c2-a71cba79ed04 req-5eacd8ec-a079-46db-8b43-06663078e552 service nova] Acquiring lock "8a0d9829-6759-4593-9230-459a546a5908-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 569.451974] env[60024]: DEBUG oslo_concurrency.lockutils [req-ff10ab54-6fd8-4921-94c2-a71cba79ed04 req-5eacd8ec-a079-46db-8b43-06663078e552 service nova] Lock "8a0d9829-6759-4593-9230-459a546a5908-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 569.452186] env[60024]: DEBUG oslo_concurrency.lockutils [req-ff10ab54-6fd8-4921-94c2-a71cba79ed04 req-5eacd8ec-a079-46db-8b43-06663078e552 service nova] Lock "8a0d9829-6759-4593-9230-459a546a5908-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 569.454029] env[60024]: DEBUG nova.compute.manager [req-ff10ab54-6fd8-4921-94c2-a71cba79ed04 req-5eacd8ec-a079-46db-8b43-06663078e552 service nova] [instance: 8a0d9829-6759-4593-9230-459a546a5908] No waiting events found dispatching network-vif-plugged-e709139c-f29a-48b4-88ea-ff5a533c2b9b {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 
569.454029] env[60024]: WARNING nova.compute.manager [req-ff10ab54-6fd8-4921-94c2-a71cba79ed04 req-5eacd8ec-a079-46db-8b43-06663078e552 service nova] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Received unexpected event network-vif-plugged-e709139c-f29a-48b4-88ea-ff5a533c2b9b for instance with vm_state building and task_state spawning. [ 569.502367] env[60024]: DEBUG nova.network.neutron [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 569.507112] env[60024]: DEBUG nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Start spawning the instance on the hypervisor. {{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 569.530218] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c33c7e0e-7695-4e15-8fdc-81919c7a9e13 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.542111] env[60024]: DEBUG nova.virt.hardware [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 569.542111] env[60024]: DEBUG nova.virt.hardware [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 569.542320] env[60024]: DEBUG nova.virt.hardware [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 569.542355] env[60024]: DEBUG nova.virt.hardware [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 569.542488] env[60024]: DEBUG nova.virt.hardware [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Image pref 0:0:0 
{{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 569.542638] env[60024]: DEBUG nova.virt.hardware [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 569.542889] env[60024]: DEBUG nova.virt.hardware [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 569.543085] env[60024]: DEBUG nova.virt.hardware [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 569.543265] env[60024]: DEBUG nova.virt.hardware [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 569.543565] env[60024]: DEBUG nova.virt.hardware [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 569.543603] env[60024]: DEBUG nova.virt.hardware [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 569.544480] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3d91cc8-3395-4b0d-9824-ed657257bb98 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.557255] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82e82fbb-a16f-40e5-bf92-a42bfdcac5eb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.602748] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f472a7d6-a756-4728-872a-da5ac4b2c519 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.607615] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b1bca15-35d9-4871-be9f-d3049e85052f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.624800] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ce2ccbe-c226-4294-bc7c-cea516b7f9b0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.631035] env[60024]: DEBUG 
nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Instance VIF info [] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 569.636626] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Creating folder: Project (7864d1883b764303a345385abda4d8ae). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 569.637072] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5fc3260d-b617-4973-ab75-616aee9cd342 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.655422] env[60024]: DEBUG nova.compute.provider_tree [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 569.658831] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Created folder: Project (7864d1883b764303a345385abda4d8ae) in parent group-v894073. [ 569.659288] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Creating folder: Instances. Parent ref: group-v894080. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 569.659288] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6c5a3b68-0d5f-4de7-84d4-436c22f9e7de {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.672584] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Created folder: Instances in parent group-v894080. [ 569.672584] env[60024]: DEBUG oslo.service.loopingcall [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 569.677597] env[60024]: DEBUG nova.scheduler.client.report [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 569.684930] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 569.685155] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ae6c5d52-328b-41c6-9048-0fd7b367298c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.701642] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.386s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 569.701918] env[60024]: DEBUG nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 569.713671] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 569.713671] env[60024]: value = "task-4576208" [ 569.713671] env[60024]: _type = "Task" [ 569.713671] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 569.731417] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576208, 'name': CreateVM_Task} progress is 6%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 569.743744] env[60024]: DEBUG nova.compute.utils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 569.747326] env[60024]: DEBUG nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Allocating IP information in the background. 
{{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 569.747667] env[60024]: DEBUG nova.network.neutron [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 569.763808] env[60024]: DEBUG nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 569.869534] env[60024]: DEBUG nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Start spawning the instance on the hypervisor. {{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 569.904153] env[60024]: DEBUG nova.virt.hardware [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 569.904350] env[60024]: DEBUG nova.virt.hardware [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 569.904423] env[60024]: DEBUG nova.virt.hardware [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 569.904829] env[60024]: DEBUG nova.virt.hardware [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 569.904829] env[60024]: DEBUG nova.virt.hardware [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 569.904969] env[60024]: DEBUG 
nova.virt.hardware [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 569.905256] env[60024]: DEBUG nova.virt.hardware [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 569.905394] env[60024]: DEBUG nova.virt.hardware [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 569.905796] env[60024]: DEBUG nova.virt.hardware [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 569.906030] env[60024]: DEBUG nova.virt.hardware [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 569.906224] env[60024]: DEBUG nova.virt.hardware [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 569.907737] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9ca16c6-da9f-4023-89e4-45dd4035e907 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.920704] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53e9d28f-69e2-444d-a99c-43a5e6541462 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.068384] env[60024]: DEBUG nova.policy [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c06f3b2e0bd4459696b6724fa90f3809', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0031e355e57421a8d48003a7eb717db', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 570.227904] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576208, 'name': CreateVM_Task, 'duration_secs': 0.279762} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 570.227904] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 570.230880] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 570.230880] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 570.230880] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 570.230880] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6705bbed-2b7f-44d4-80b6-35a1a768f870 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.238125] env[60024]: DEBUG oslo_vmware.api [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Waiting for the task: (returnval){ [ 570.238125] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52ba3bf7-0596-2ca7-5588-456415fd0bbc" [ 570.238125] env[60024]: _type = "Task" [ 570.238125] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 570.247163] env[60024]: DEBUG oslo_vmware.api [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52ba3bf7-0596-2ca7-5588-456415fd0bbc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 570.369507] env[60024]: DEBUG nova.network.neutron [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Updating instance_info_cache with network_info: [{"id": "4f2cfc88-49e8-4fca-87cb-b8e60d729b53", "address": "fa:16:3e:34:dd:35", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.138", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4f2cfc88-49", "ovs_interfaceid": "4f2cfc88-49e8-4fca-87cb-b8e60d729b53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 570.386695] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Releasing lock "refresh_cache-c4400e80-4457-4a8a-8588-f594e5993cde" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 570.386695] env[60024]: DEBUG nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Instance network_info: |[{"id": "4f2cfc88-49e8-4fca-87cb-b8e60d729b53", "address": "fa:16:3e:34:dd:35", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.138", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4f2cfc88-49", "ovs_interfaceid": "4f2cfc88-49e8-4fca-87cb-b8e60d729b53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 570.386895] 
env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:dd:35', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3d31a554-a94c-4471-892f-f65aa87b8279', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4f2cfc88-49e8-4fca-87cb-b8e60d729b53', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 570.395576] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Creating folder: Project (8daf8ee8cb0542f099584ca77665f732). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 570.396417] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-df06d210-858a-425c-b7da-ae641e748e74 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.408932] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Created folder: Project (8daf8ee8cb0542f099584ca77665f732) in parent group-v894073. [ 570.411036] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Creating folder: Instances. Parent ref: group-v894083. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 570.411036] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-be9638bc-8f44-46cf-a98a-a2918bb5e6a6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.421188] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Created folder: Instances in parent group-v894083. [ 570.423214] env[60024]: DEBUG oslo.service.loopingcall [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 570.423214] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 570.423214] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ce2daf84-bc0b-4f87-959c-f26eba124a35 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.448551] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 570.448551] env[60024]: value = "task-4576211" [ 570.448551] env[60024]: _type = "Task" [ 570.448551] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 570.460360] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576211, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 570.756556] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 570.756878] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 570.756878] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 570.964667] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576211, 'name': CreateVM_Task, 'duration_secs': 0.334669} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 570.964892] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 570.965725] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 570.965899] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 570.966475] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 570.967058] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-218bd1bd-d77c-4c06-a9b4-63daebdd89a2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.974041] env[60024]: DEBUG oslo_vmware.api [None 
req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Waiting for the task: (returnval){ [ 570.974041] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52ff0396-0163-040e-b29b-b843a0fd2ba8" [ 570.974041] env[60024]: _type = "Task" [ 570.974041] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 570.984641] env[60024]: DEBUG oslo_vmware.api [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52ff0396-0163-040e-b29b-b843a0fd2ba8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 571.491930] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 571.491930] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 571.491930] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 571.534224] env[60024]: DEBUG nova.network.neutron [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Successfully created port: f15d0aba-4a9a-4e45-b532-0d699daccc5a {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 572.566482] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Acquiring lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.566482] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.581848] env[60024]: DEBUG 
nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 572.641124] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 572.641124] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 572.641890] env[60024]: INFO nova.compute.claims [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 572.832377] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1064a22b-53d4-459b-926f-d4ff6824dcb7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.848313] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6af71852-3cce-403e-a555-8206913217a6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.890262] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04abf4aa-a803-42e5-803c-765e95866de1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.901257] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43072fd3-2d5f-4e2e-8206-229bd3d8e1aa {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.917430] env[60024]: DEBUG nova.compute.provider_tree [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 572.926384] env[60024]: DEBUG nova.scheduler.client.report [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 572.943030] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 572.943580] env[60024]: DEBUG nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 572.978679] env[60024]: DEBUG nova.compute.utils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 572.981667] env[60024]: DEBUG nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 572.981667] env[60024]: DEBUG nova.network.neutron [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 572.996160] env[60024]: DEBUG nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 573.082204] env[60024]: DEBUG nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 573.111260] env[60024]: DEBUG nova.virt.hardware [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 573.111260] env[60024]: DEBUG nova.virt.hardware [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 573.111260] env[60024]: DEBUG nova.virt.hardware [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 573.111420] env[60024]: DEBUG nova.virt.hardware [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 573.111420] env[60024]: DEBUG nova.virt.hardware [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 573.111420] env[60024]: DEBUG nova.virt.hardware [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 573.111420] env[60024]: DEBUG nova.virt.hardware [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 573.111971] env[60024]: DEBUG nova.virt.hardware [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 573.112469] env[60024]: DEBUG nova.virt.hardware [None req-e59749df-cdb7-431a-98b4-6be65789904d 
tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 573.112788] env[60024]: DEBUG nova.virt.hardware [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 573.113137] env[60024]: DEBUG nova.virt.hardware [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 573.114243] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-683ff661-70d0-4a3a-98a4-14d77f831e19 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.120023] env[60024]: DEBUG nova.network.neutron [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Successfully updated port: f15d0aba-4a9a-4e45-b532-0d699daccc5a {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 573.129658] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-149def7f-7d76-4242-a3f0-380794cabcee {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.136142] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "refresh_cache-7a4778b7-5ffc-4641-b968-d0304fd67ee0" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 573.136459] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquired lock "refresh_cache-7a4778b7-5ffc-4641-b968-d0304fd67ee0" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 573.136648] env[60024]: DEBUG nova.network.neutron [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 573.232504] env[60024]: DEBUG nova.network.neutron [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Instance cache missing network info. 
{{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 573.241781] env[60024]: DEBUG nova.policy [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7a11dbc35f434cfab97abf7033b16758', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f500e2d8bd8b4db28dc4c1f088d12990', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 573.916076] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Acquiring lock "363f5261-d589-4f99-b7dd-ab8f16cefee3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 573.916752] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Lock "363f5261-d589-4f99-b7dd-ab8f16cefee3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 573.932933] env[60024]: DEBUG nova.network.neutron [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Updating instance_info_cache with network_info: [{"id": "f15d0aba-4a9a-4e45-b532-0d699daccc5a", "address": "fa:16:3e:80:49:1e", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf15d0aba-4a", "ovs_interfaceid": "f15d0aba-4a9a-4e45-b532-0d699daccc5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 573.938256] env[60024]: DEBUG nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Starting instance... 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 573.947225] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Releasing lock "refresh_cache-7a4778b7-5ffc-4641-b968-d0304fd67ee0" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 573.947536] env[60024]: DEBUG nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Instance network_info: |[{"id": "f15d0aba-4a9a-4e45-b532-0d699daccc5a", "address": "fa:16:3e:80:49:1e", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf15d0aba-4a", "ovs_interfaceid": "f15d0aba-4a9a-4e45-b532-0d699daccc5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 573.947916] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:80:49:1e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3d31a554-a94c-4471-892f-f65aa87b8279', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f15d0aba-4a9a-4e45-b532-0d699daccc5a', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 573.960678] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Creating folder: Project (f0031e355e57421a8d48003a7eb717db). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 573.962053] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-882f1db1-cec5-4419-ba04-f4d0dd93d8a6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.981023] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Created folder: Project (f0031e355e57421a8d48003a7eb717db) in parent group-v894073. 
[ 573.981023] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Creating folder: Instances. Parent ref: group-v894086. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 573.981023] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-405ecdfb-9456-4b91-9b08-f6a1f3b0df10 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.995292] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Created folder: Instances in parent group-v894086. [ 573.995897] env[60024]: DEBUG oslo.service.loopingcall [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 573.996506] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 573.997202] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bac7d7ad-1397-4577-b808-9683f46e6175 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.029025] env[60024]: DEBUG nova.compute.manager [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Received event network-changed-e709139c-f29a-48b4-88ea-ff5a533c2b9b {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 574.029025] env[60024]: DEBUG nova.compute.manager [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Refreshing instance network info cache due to event network-changed-e709139c-f29a-48b4-88ea-ff5a533c2b9b. 
{{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 574.029025] env[60024]: DEBUG oslo_concurrency.lockutils [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] Acquiring lock "refresh_cache-8a0d9829-6759-4593-9230-459a546a5908" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 574.029025] env[60024]: DEBUG oslo_concurrency.lockutils [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] Acquired lock "refresh_cache-8a0d9829-6759-4593-9230-459a546a5908" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 574.029025] env[60024]: DEBUG nova.network.neutron [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Refreshing network info cache for port e709139c-f29a-48b4-88ea-ff5a533c2b9b {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 574.039194] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 574.039194] env[60024]: value = "task-4576214" [ 574.039194] env[60024]: _type = "Task" [ 574.039194] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 574.050818] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576214, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 574.054603] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 574.054704] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 574.057417] env[60024]: INFO nova.compute.claims [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 574.235130] env[60024]: DEBUG nova.network.neutron [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Successfully created port: 887beb35-0ce2-4f8e-9d80-36cfc959c052 {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 574.289671] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff2b9e9a-66d5-4887-a425-0a5fb36843ea {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.302374] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dec6f51a-82f2-4ac8-abec-99ee3d2a2a23 
{{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.347462] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7403ca1a-5f45-430d-aedf-37d9285f6128 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.361232] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b799b423-67d7-4c95-a8a7-f07c001ed89b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.378196] env[60024]: DEBUG nova.compute.provider_tree [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 574.391793] env[60024]: DEBUG nova.scheduler.client.report [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 574.418243] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.360s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 574.419565] env[60024]: DEBUG nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 574.463089] env[60024]: DEBUG nova.compute.utils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 574.468022] env[60024]: DEBUG nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Allocating IP information in the background. 
{{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 574.468022] env[60024]: DEBUG nova.network.neutron [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 574.474191] env[60024]: DEBUG nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 574.555797] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576214, 'name': CreateVM_Task} progress is 99%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 574.561768] env[60024]: DEBUG nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Start spawning the instance on the hypervisor. {{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 574.589633] env[60024]: DEBUG nova.virt.hardware [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 574.589762] env[60024]: DEBUG nova.virt.hardware [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 574.589873] env[60024]: DEBUG nova.virt.hardware [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 574.590023] env[60024]: DEBUG nova.virt.hardware [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 574.590240] env[60024]: DEBUG nova.virt.hardware [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 
tempest-ServerExternalEventsTest-81672727-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 574.590508] env[60024]: DEBUG nova.virt.hardware [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 574.590644] env[60024]: DEBUG nova.virt.hardware [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 574.590787] env[60024]: DEBUG nova.virt.hardware [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 574.590958] env[60024]: DEBUG nova.virt.hardware [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 574.591138] env[60024]: DEBUG nova.virt.hardware [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 574.591315] env[60024]: DEBUG nova.virt.hardware [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 574.592439] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-390f5c94-8c1b-470a-9f16-3009613f4f44 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.607728] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6628f5c3-8e52-4ddc-830a-9e977c88ab72 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.631169] env[60024]: DEBUG nova.policy [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '61ceb0491981460e907a6e5e77cb19ed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '216628e9ba21424994765f97d38f1fcc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize 
/opt/stack/nova/nova/policy.py:203}} [ 574.711274] env[60024]: DEBUG nova.network.neutron [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Updated VIF entry in instance network info cache for port e709139c-f29a-48b4-88ea-ff5a533c2b9b. {{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 574.711274] env[60024]: DEBUG nova.network.neutron [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Updating instance_info_cache with network_info: [{"id": "e709139c-f29a-48b4-88ea-ff5a533c2b9b", "address": "fa:16:3e:8f:a7:25", "network": {"id": "ce212b44-d471-4ec9-acb2-9df369ad358e", "bridge": "br-int", "label": "tempest-ImagesTestJSON-910957629-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4b3edaaa3fdc4f73b49b8e57e04b8fa0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "089a7624-43ba-4fce-bfc0-63e4bb7f9aeb", "external-id": "nsx-vlan-transportzone-218", "segmentation_id": 218, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape709139c-f2", "ovs_interfaceid": "e709139c-f29a-48b4-88ea-ff5a533c2b9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 574.723123] env[60024]: DEBUG oslo_concurrency.lockutils [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] Releasing lock "refresh_cache-8a0d9829-6759-4593-9230-459a546a5908" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 574.723262] env[60024]: DEBUG nova.compute.manager [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Received event network-vif-plugged-4f2cfc88-49e8-4fca-87cb-b8e60d729b53 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 574.724244] env[60024]: DEBUG oslo_concurrency.lockutils [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] Acquiring lock "c4400e80-4457-4a8a-8588-f594e5993cde-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 574.724244] env[60024]: DEBUG oslo_concurrency.lockutils [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] Lock "c4400e80-4457-4a8a-8588-f594e5993cde-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 574.724244] env[60024]: DEBUG oslo_concurrency.lockutils [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] 
Lock "c4400e80-4457-4a8a-8588-f594e5993cde-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 574.724244] env[60024]: DEBUG nova.compute.manager [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] No waiting events found dispatching network-vif-plugged-4f2cfc88-49e8-4fca-87cb-b8e60d729b53 {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 574.724393] env[60024]: WARNING nova.compute.manager [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Received unexpected event network-vif-plugged-4f2cfc88-49e8-4fca-87cb-b8e60d729b53 for instance with vm_state building and task_state spawning. [ 574.724393] env[60024]: DEBUG nova.compute.manager [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Received event network-changed-4f2cfc88-49e8-4fca-87cb-b8e60d729b53 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 574.724457] env[60024]: DEBUG nova.compute.manager [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Refreshing instance network info cache due to event network-changed-4f2cfc88-49e8-4fca-87cb-b8e60d729b53. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 574.724678] env[60024]: DEBUG oslo_concurrency.lockutils [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] Acquiring lock "refresh_cache-c4400e80-4457-4a8a-8588-f594e5993cde" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 574.724758] env[60024]: DEBUG oslo_concurrency.lockutils [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] Acquired lock "refresh_cache-c4400e80-4457-4a8a-8588-f594e5993cde" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 574.724895] env[60024]: DEBUG nova.network.neutron [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Refreshing network info cache for port 4f2cfc88-49e8-4fca-87cb-b8e60d729b53 {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 575.051752] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576214, 'name': CreateVM_Task} progress is 99%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 575.074446] env[60024]: DEBUG nova.network.neutron [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Successfully created port: ad0f2c95-be2b-4375-b296-5d5d3e581516 {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 575.351430] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 575.351833] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 575.352064] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Starting heal instance info cache {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 575.352199] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Rebuilding the list of instances to heal {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 575.373991] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 575.374188] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 575.374311] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 575.377103] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 575.377103] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 575.377103] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Skipping network cache update for instance because it is Building. 
{{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 575.377103] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 575.377103] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Didn't find any instances for network info cache update. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 575.384318] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 575.384614] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 575.385177] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 575.385398] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 575.385594] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 575.385783] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 575.385956] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60024) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 575.386133] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 575.401254] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 575.403039] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 575.403039] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 575.403039] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60024) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 575.406104] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af3aa1ff-6617-4438-93aa-c49d0fb68cb5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.420091] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc472c06-c189-4c7e-8321-efc4f7c3955d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.438951] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82974a48-2591-40d9-9993-2426966ccf04 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.449454] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c66c69e6-911d-46e7-9b5f-5b2d9f031230 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.488320] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180698MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60024) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 575.488320] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 575.488320] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 575.552359] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576214, 'name': CreateVM_Task, 'duration_secs': 1.361008} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 575.552806] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 575.554462] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 575.555783] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 575.556123] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 575.556559] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e0cf7b25-e817-4c98-b451-05cd77239503 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.566220] env[60024]: DEBUG oslo_vmware.api [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Waiting for the task: (returnval){ [ 575.566220] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52f8d4b7-72c6-3e4c-82c9-607226968647" [ 575.566220] env[60024]: _type = "Task" [ 575.566220] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 575.581868] env[60024]: DEBUG oslo_vmware.api [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52f8d4b7-72c6-3e4c-82c9-607226968647, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 575.590952] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 15e44d1f-ae9b-4ff7-841c-90acc81cf38b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 575.591103] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 8a0d9829-6759-4593-9230-459a546a5908 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 575.591246] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance c4400e80-4457-4a8a-8588-f594e5993cde actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 575.591367] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 68c87b51-b90a-47cc-bec1-05f7c389fc14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 575.591487] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 7a4778b7-5ffc-4641-b968-d0304fd67ee0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 575.591639] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 37916d26-1b5e-4991-83a2-ca5a5b00c2ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 575.591825] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 363f5261-d589-4f99-b7dd-ab8f16cefee3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 575.592734] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 575.592734] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=100GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 575.595146] env[60024]: DEBUG nova.network.neutron [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Successfully created port: 3f56120b-0dc1-4db9-a8da-94a8fd753130 {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 575.726930] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee55afc4-e4ce-478d-a1c0-a63ad53c0e8a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.735991] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6eb29ce0-d339-4e1a-98d2-e4a344ae474d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.772752] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b3cf341-7148-48fe-9a60-1e09f0ce3f91 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.782457] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39a364e0-490b-438b-9dfd-38e3357e92a0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.801631] env[60024]: DEBUG nova.compute.provider_tree [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 575.813507] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 575.859652] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60024) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 575.860026] env[60024]: DEBUG oslo_concurrency.lockutils [None 
req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.372s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 575.953246] env[60024]: DEBUG nova.network.neutron [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Updated VIF entry in instance network info cache for port 4f2cfc88-49e8-4fca-87cb-b8e60d729b53. {{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 575.953298] env[60024]: DEBUG nova.network.neutron [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Updating instance_info_cache with network_info: [{"id": "4f2cfc88-49e8-4fca-87cb-b8e60d729b53", "address": "fa:16:3e:34:dd:35", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.138", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4f2cfc88-49", "ovs_interfaceid": "4f2cfc88-49e8-4fca-87cb-b8e60d729b53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 575.972261] env[60024]: DEBUG oslo_concurrency.lockutils [req-765268c6-7830-4360-b7ce-30e33de1b20a req-8f849349-a1a4-481f-974b-d48eaf8ab2a0 service nova] Releasing lock "refresh_cache-c4400e80-4457-4a8a-8588-f594e5993cde" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 576.084745] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 576.085025] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 576.085249] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "[datastore2] 
devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 576.212016] env[60024]: DEBUG nova.network.neutron [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Successfully created port: b13af17d-15e6-4b21-b707-671f401eb815 {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 576.229329] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "ce222a29-3611-45b3-9664-87ae2fb1b1b8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 576.230036] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "ce222a29-3611-45b3-9664-87ae2fb1b1b8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 576.244217] env[60024]: DEBUG nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 576.337084] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 576.337084] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 576.338437] env[60024]: INFO nova.compute.claims [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 576.652977] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c846d92-1d81-4c91-a603-c9fdb13f38b0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.661798] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89d14367-bfd9-4730-b05a-025d3c1097c9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.700210] env[60024]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-915e612e-af73-4d72-b7b5-a3d7fe6d7f10 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.712815] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-457c1a06-8627-4537-b47f-afd171be5309 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.725101] env[60024]: DEBUG nova.compute.provider_tree [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 576.736400] env[60024]: DEBUG nova.scheduler.client.report [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 576.755276] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.418s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 576.756807] env[60024]: DEBUG nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 576.806327] env[60024]: DEBUG nova.compute.utils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 576.807275] env[60024]: DEBUG nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Allocating IP information in the background. 
{{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 576.807762] env[60024]: DEBUG nova.network.neutron [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 576.822059] env[60024]: DEBUG nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 576.872402] env[60024]: DEBUG nova.network.neutron [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Successfully updated port: 3f56120b-0dc1-4db9-a8da-94a8fd753130 {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 576.882164] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Acquiring lock "refresh_cache-363f5261-d589-4f99-b7dd-ab8f16cefee3" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 576.882317] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Acquired lock "refresh_cache-363f5261-d589-4f99-b7dd-ab8f16cefee3" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 576.882467] env[60024]: DEBUG nova.network.neutron [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 576.906809] env[60024]: DEBUG nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 576.950412] env[60024]: DEBUG nova.virt.hardware [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 576.950748] env[60024]: DEBUG nova.virt.hardware [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 576.950946] env[60024]: DEBUG nova.virt.hardware [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 576.951177] env[60024]: DEBUG nova.virt.hardware [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 576.951365] env[60024]: DEBUG nova.virt.hardware [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 576.951546] env[60024]: DEBUG nova.virt.hardware [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 576.951798] env[60024]: DEBUG nova.virt.hardware [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 576.952038] env[60024]: DEBUG nova.virt.hardware [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 576.952223] env[60024]: DEBUG nova.virt.hardware [None 
req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 576.952392] env[60024]: DEBUG nova.virt.hardware [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 576.952568] env[60024]: DEBUG nova.virt.hardware [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 576.953495] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4df44b4b-9669-47a3-b727-34b5499b9549 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.965121] env[60024]: DEBUG nova.policy [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f71b64aa4f71462bb5f6eb5b4083ee63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '730d218125d1484687ab1b68a1e73d2e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 576.969615] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8d9aae8-f478-43c9-8921-ed680a0b262f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.976390] env[60024]: DEBUG nova.network.neutron [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Instance cache missing network info. 
{{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 577.337837] env[60024]: DEBUG nova.network.neutron [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Updating instance_info_cache with network_info: [{"id": "3f56120b-0dc1-4db9-a8da-94a8fd753130", "address": "fa:16:3e:49:00:ac", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3f56120b-0d", "ovs_interfaceid": "3f56120b-0dc1-4db9-a8da-94a8fd753130", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 577.352835] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Releasing lock "refresh_cache-363f5261-d589-4f99-b7dd-ab8f16cefee3" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 577.352990] env[60024]: DEBUG nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Instance network_info: |[{"id": "3f56120b-0dc1-4db9-a8da-94a8fd753130", "address": "fa:16:3e:49:00:ac", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3f56120b-0d", "ovs_interfaceid": "3f56120b-0dc1-4db9-a8da-94a8fd753130", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 577.353659] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None 
req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:49:00:ac', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3d31a554-a94c-4471-892f-f65aa87b8279', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3f56120b-0dc1-4db9-a8da-94a8fd753130', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 577.361465] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Creating folder: Project (216628e9ba21424994765f97d38f1fcc). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 577.362126] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-eed1b52b-3bd3-42b8-af95-fe9ee6d90a23 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 577.375853] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Created folder: Project (216628e9ba21424994765f97d38f1fcc) in parent group-v894073. [ 577.376121] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Creating folder: Instances. Parent ref: group-v894089. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 577.376988] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-27778cc0-ae8f-42a1-8dd2-33ea6929caf1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 577.390487] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Created folder: Instances in parent group-v894089. [ 577.391229] env[60024]: DEBUG oslo.service.loopingcall [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 577.391229] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 577.391365] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-308d0e2d-94ee-4f32-8de1-33049c36ef27 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 577.418151] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 577.418151] env[60024]: value = "task-4576217" [ 577.418151] env[60024]: _type = "Task" [ 577.418151] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 577.430160] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576217, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 577.902209] env[60024]: DEBUG nova.network.neutron [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Successfully created port: 3e6e8d19-b28b-4fe5-bb41-b3cada986c2f {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 577.928458] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576217, 'name': CreateVM_Task, 'duration_secs': 0.333661} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 577.928704] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 577.929431] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 577.929602] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 577.929931] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 577.930216] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-93344439-42a0-45de-82ae-30b24672bb1d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 577.943024] env[60024]: DEBUG oslo_vmware.api [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Waiting for the task: (returnval){ [ 577.943024] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]520c3430-53cd-188e-c865-78b1d5b6a9c5" [ 577.943024] env[60024]: _type = "Task" [ 577.943024] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 577.951765] env[60024]: DEBUG oslo_vmware.api [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]520c3430-53cd-188e-c865-78b1d5b6a9c5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 578.457604] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 578.457604] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 578.457914] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 578.518982] env[60024]: DEBUG nova.network.neutron [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Successfully updated port: 887beb35-0ce2-4f8e-9d80-36cfc959c052 {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 578.818956] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Acquiring lock "9214a18f-c22d-4e24-980e-7241a2b993bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 578.818956] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Lock "9214a18f-c22d-4e24-980e-7241a2b993bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 578.832271] env[60024]: DEBUG nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Starting instance... 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 578.888861] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 578.889325] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 578.893481] env[60024]: INFO nova.compute.claims [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 579.123480] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27674f5c-5b96-4e35-9757-4de141cc6834 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 579.133275] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be79bf86-a6d4-4d08-adc5-faeeb1b60686 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 579.175677] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffe52915-94cf-4b76-b509-446658a253db {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 579.187336] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-056133d6-b3ce-4325-96a1-3a7aed74cd9e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 579.203341] env[60024]: DEBUG nova.compute.provider_tree [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 579.214018] env[60024]: DEBUG nova.scheduler.client.report [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 579.232093] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.343s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 579.232720] env[60024]: DEBUG nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 579.276319] env[60024]: DEBUG nova.compute.utils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 579.278547] env[60024]: DEBUG nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 579.278772] env[60024]: DEBUG nova.network.neutron [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 579.292800] env[60024]: DEBUG nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 579.383027] env[60024]: DEBUG nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 579.405233] env[60024]: DEBUG nova.policy [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f525273b7a5e4b9d836ee998eefc8ee2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dc17a2829e0b4b7298ae746628595053', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 579.416362] env[60024]: DEBUG nova.virt.hardware [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 579.416660] env[60024]: DEBUG nova.virt.hardware [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 579.417532] env[60024]: DEBUG nova.virt.hardware [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 579.417722] env[60024]: DEBUG nova.virt.hardware [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 579.417895] env[60024]: DEBUG nova.virt.hardware [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 579.418091] env[60024]: DEBUG nova.virt.hardware [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Chose 
sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 579.418326] env[60024]: DEBUG nova.virt.hardware [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 579.418491] env[60024]: DEBUG nova.virt.hardware [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 579.418945] env[60024]: DEBUG nova.virt.hardware [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 579.418945] env[60024]: DEBUG nova.virt.hardware [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 579.419058] env[60024]: DEBUG nova.virt.hardware [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 579.420200] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d307a1c7-f9ae-49b3-8dd7-b7102427778a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 579.430521] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40a4c603-aa74-4b1d-a3cb-0ad12ab6c6d1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 580.083509] env[60024]: DEBUG nova.network.neutron [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Successfully created port: a863c9bd-8117-454c-988f-693c41aea068 {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 580.209254] env[60024]: DEBUG nova.network.neutron [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Successfully updated port: 3e6e8d19-b28b-4fe5-bb41-b3cada986c2f {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 580.223058] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f 
tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "refresh_cache-ce222a29-3611-45b3-9664-87ae2fb1b1b8" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 580.223058] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquired lock "refresh_cache-ce222a29-3611-45b3-9664-87ae2fb1b1b8" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 580.223058] env[60024]: DEBUG nova.network.neutron [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 580.340328] env[60024]: DEBUG nova.network.neutron [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 580.402713] env[60024]: DEBUG nova.compute.manager [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Received event network-vif-plugged-f15d0aba-4a9a-4e45-b532-0d699daccc5a {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 580.403286] env[60024]: DEBUG oslo_concurrency.lockutils [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] Acquiring lock "7a4778b7-5ffc-4641-b968-d0304fd67ee0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 580.403760] env[60024]: DEBUG oslo_concurrency.lockutils [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] Lock "7a4778b7-5ffc-4641-b968-d0304fd67ee0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 580.404986] env[60024]: DEBUG oslo_concurrency.lockutils [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] Lock "7a4778b7-5ffc-4641-b968-d0304fd67ee0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 580.405387] env[60024]: DEBUG nova.compute.manager [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] No waiting events found dispatching network-vif-plugged-f15d0aba-4a9a-4e45-b532-0d699daccc5a {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 580.405920] env[60024]: WARNING nova.compute.manager [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Received unexpected event 
network-vif-plugged-f15d0aba-4a9a-4e45-b532-0d699daccc5a for instance with vm_state building and task_state spawning. [ 580.407967] env[60024]: DEBUG nova.compute.manager [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Received event network-changed-f15d0aba-4a9a-4e45-b532-0d699daccc5a {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 580.407967] env[60024]: DEBUG nova.compute.manager [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Refreshing instance network info cache due to event network-changed-f15d0aba-4a9a-4e45-b532-0d699daccc5a. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 580.407967] env[60024]: DEBUG oslo_concurrency.lockutils [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] Acquiring lock "refresh_cache-7a4778b7-5ffc-4641-b968-d0304fd67ee0" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 580.407967] env[60024]: DEBUG oslo_concurrency.lockutils [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] Acquired lock "refresh_cache-7a4778b7-5ffc-4641-b968-d0304fd67ee0" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 580.407967] env[60024]: DEBUG nova.network.neutron [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Refreshing network info cache for port f15d0aba-4a9a-4e45-b532-0d699daccc5a {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 580.769249] env[60024]: DEBUG nova.network.neutron [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Successfully updated port: ad0f2c95-be2b-4375-b296-5d5d3e581516 {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 581.322869] env[60024]: DEBUG nova.network.neutron [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Updating instance_info_cache with network_info: [{"id": "3e6e8d19-b28b-4fe5-bb41-b3cada986c2f", "address": "fa:16:3e:68:5a:7f", "network": {"id": "271e6595-cea8-4029-8fab-8f17d8eac2ed", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-118947530-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "730d218125d1484687ab1b68a1e73d2e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8459aaf-d6a8-46fb-ad14-464ac3104695", "external-id": "nsx-vlan-transportzone-46", "segmentation_id": 46, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3e6e8d19-b2", "ovs_interfaceid": "3e6e8d19-b28b-4fe5-bb41-b3cada986c2f", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 581.345571] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Releasing lock "refresh_cache-ce222a29-3611-45b3-9664-87ae2fb1b1b8" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 581.346035] env[60024]: DEBUG nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Instance network_info: |[{"id": "3e6e8d19-b28b-4fe5-bb41-b3cada986c2f", "address": "fa:16:3e:68:5a:7f", "network": {"id": "271e6595-cea8-4029-8fab-8f17d8eac2ed", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-118947530-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "730d218125d1484687ab1b68a1e73d2e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8459aaf-d6a8-46fb-ad14-464ac3104695", "external-id": "nsx-vlan-transportzone-46", "segmentation_id": 46, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3e6e8d19-b2", "ovs_interfaceid": "3e6e8d19-b28b-4fe5-bb41-b3cada986c2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 581.346561] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:68:5a:7f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c8459aaf-d6a8-46fb-ad14-464ac3104695', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3e6e8d19-b28b-4fe5-bb41-b3cada986c2f', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 581.359167] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Creating folder: Project (730d218125d1484687ab1b68a1e73d2e). Parent ref: group-v894073. 
{{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 581.359167] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f6aa4a47-ca1f-4def-a1f6-f5ae1743cb5c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 581.371894] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Created folder: Project (730d218125d1484687ab1b68a1e73d2e) in parent group-v894073. [ 581.372092] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Creating folder: Instances. Parent ref: group-v894092. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 581.372354] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8425bc97-3efb-44b3-9ee6-f6540d11daf0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 581.390168] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Created folder: Instances in parent group-v894092. [ 581.390865] env[60024]: DEBUG oslo.service.loopingcall [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 581.390865] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 581.391081] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ab5f94be-ccc5-4cb8-8c76-5f6441cf17d9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 581.415451] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 581.415451] env[60024]: value = "task-4576220" [ 581.415451] env[60024]: _type = "Task" [ 581.415451] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 581.431611] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576220, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 581.561789] env[60024]: DEBUG nova.network.neutron [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Updated VIF entry in instance network info cache for port f15d0aba-4a9a-4e45-b532-0d699daccc5a. 
{{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 581.562152] env[60024]: DEBUG nova.network.neutron [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Updating instance_info_cache with network_info: [{"id": "f15d0aba-4a9a-4e45-b532-0d699daccc5a", "address": "fa:16:3e:80:49:1e", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.73", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf15d0aba-4a", "ovs_interfaceid": "f15d0aba-4a9a-4e45-b532-0d699daccc5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 581.575319] env[60024]: DEBUG oslo_concurrency.lockutils [req-50825ba6-9d0d-48e2-ad6b-e4d19411ca7b req-613022db-b347-4296-b5ac-60b7f7fc5a32 service nova] Releasing lock "refresh_cache-7a4778b7-5ffc-4641-b968-d0304fd67ee0" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 581.693332] env[60024]: DEBUG nova.compute.manager [req-163899c0-508a-4f10-a84d-c851bcf3aa2f req-17e05178-12c3-4b86-9f99-ff58532c6d3a service nova] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Received event network-vif-plugged-3f56120b-0dc1-4db9-a8da-94a8fd753130 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 581.695445] env[60024]: DEBUG oslo_concurrency.lockutils [req-163899c0-508a-4f10-a84d-c851bcf3aa2f req-17e05178-12c3-4b86-9f99-ff58532c6d3a service nova] Acquiring lock "363f5261-d589-4f99-b7dd-ab8f16cefee3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 581.695746] env[60024]: DEBUG oslo_concurrency.lockutils [req-163899c0-508a-4f10-a84d-c851bcf3aa2f req-17e05178-12c3-4b86-9f99-ff58532c6d3a service nova] Lock "363f5261-d589-4f99-b7dd-ab8f16cefee3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 581.696449] env[60024]: DEBUG oslo_concurrency.lockutils [req-163899c0-508a-4f10-a84d-c851bcf3aa2f req-17e05178-12c3-4b86-9f99-ff58532c6d3a service nova] Lock "363f5261-d589-4f99-b7dd-ab8f16cefee3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 581.696782] env[60024]: DEBUG nova.compute.manager [req-163899c0-508a-4f10-a84d-c851bcf3aa2f 
req-17e05178-12c3-4b86-9f99-ff58532c6d3a service nova] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] No waiting events found dispatching network-vif-plugged-3f56120b-0dc1-4db9-a8da-94a8fd753130 {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 581.697448] env[60024]: WARNING nova.compute.manager [req-163899c0-508a-4f10-a84d-c851bcf3aa2f req-17e05178-12c3-4b86-9f99-ff58532c6d3a service nova] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Received unexpected event network-vif-plugged-3f56120b-0dc1-4db9-a8da-94a8fd753130 for instance with vm_state building and task_state spawning. [ 581.728611] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "3ab1b905-cd6f-4d2b-a244-f85e56f796d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 581.729282] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "3ab1b905-cd6f-4d2b-a244-f85e56f796d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 581.745419] env[60024]: DEBUG nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 581.802088] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 581.802287] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 581.804912] env[60024]: INFO nova.compute.claims [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 581.930491] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576220, 'name': CreateVM_Task, 'duration_secs': 0.308008} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 581.933346] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 581.934691] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 581.934891] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 581.935275] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 581.937870] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-37425cad-5667-4224-97fd-33530567d5cd {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 581.941862] env[60024]: DEBUG oslo_vmware.api [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Waiting for the task: (returnval){ [ 581.941862] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]5225dc00-6d3e-c539-7789-87a86d859051" [ 581.941862] env[60024]: _type = "Task" [ 581.941862] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 581.953417] env[60024]: DEBUG oslo_vmware.api [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]5225dc00-6d3e-c539-7789-87a86d859051, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 582.064018] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2d82a6a-e0dd-4998-85ad-c3a60a7b1b1a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.071901] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50b45764-5883-4dcb-a2c1-d9201c1dc2d7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.106690] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9280a664-54b8-4b51-bfb7-3db32b41ffde {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.115591] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec66f9f3-55d8-45fa-b093-4844e231c03c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.133176] env[60024]: DEBUG nova.compute.provider_tree [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 582.135593] env[60024]: DEBUG nova.network.neutron [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Successfully updated port: a863c9bd-8117-454c-988f-693c41aea068 {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 582.141986] env[60024]: DEBUG nova.scheduler.client.report [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 582.146108] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Acquiring lock "refresh_cache-9214a18f-c22d-4e24-980e-7241a2b993bd" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 582.146310] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Acquired lock "refresh_cache-9214a18f-c22d-4e24-980e-7241a2b993bd" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 582.146397] env[60024]: DEBUG nova.network.neutron [None 
req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 582.156603] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.354s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 582.158979] env[60024]: DEBUG nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 582.202190] env[60024]: DEBUG nova.compute.utils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 582.203729] env[60024]: DEBUG nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 582.204068] env[60024]: DEBUG nova.network.neutron [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 582.222033] env[60024]: DEBUG nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 582.311961] env[60024]: DEBUG nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 582.343805] env[60024]: DEBUG nova.virt.hardware [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 582.344194] env[60024]: DEBUG nova.virt.hardware [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 582.344235] env[60024]: DEBUG nova.virt.hardware [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 582.344398] env[60024]: DEBUG nova.virt.hardware [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 582.344583] env[60024]: DEBUG nova.virt.hardware [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 582.344680] env[60024]: DEBUG nova.virt.hardware [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 582.344891] env[60024]: DEBUG nova.virt.hardware [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 582.345057] env[60024]: DEBUG nova.virt.hardware [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 582.345224] env[60024]: DEBUG nova.virt.hardware [None 
req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 582.345384] env[60024]: DEBUG nova.virt.hardware [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 582.345594] env[60024]: DEBUG nova.virt.hardware [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 582.346506] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-390bdcac-e3d3-45b1-89d1-f03770ba4707 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.356643] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcec8cd6-6228-48f1-97ae-190516ad9b79 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.457652] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 582.458605] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 582.458605] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 582.468344] env[60024]: DEBUG nova.network.neutron [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Instance cache missing network info. 
{{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 582.551392] env[60024]: DEBUG nova.policy [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f71b64aa4f71462bb5f6eb5b4083ee63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '730d218125d1484687ab1b68a1e73d2e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 582.828221] env[60024]: DEBUG nova.network.neutron [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Successfully updated port: b13af17d-15e6-4b21-b707-671f401eb815 {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 582.839222] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Acquiring lock "refresh_cache-37916d26-1b5e-4991-83a2-ca5a5b00c2ac" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 582.839465] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Acquired lock "refresh_cache-37916d26-1b5e-4991-83a2-ca5a5b00c2ac" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 582.839688] env[60024]: DEBUG nova.network.neutron [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 582.925658] env[60024]: DEBUG nova.network.neutron [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Instance cache missing network info. 
{{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 583.051673] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Acquiring lock "036d6de2-f69b-4714-b89e-9c4307253675" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.051875] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Lock "036d6de2-f69b-4714-b89e-9c4307253675" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.130949] env[60024]: DEBUG nova.network.neutron [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Updating instance_info_cache with network_info: [{"id": "a863c9bd-8117-454c-988f-693c41aea068", "address": "fa:16:3e:bc:e0:68", "network": {"id": "4867f11b-b7b6-404d-a2a0-faa235d1d483", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-335371632-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dc17a2829e0b4b7298ae746628595053", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8ee8640-3787-4c27-9581-962ddb2be7e5", "external-id": "nsx-vlan-transportzone-224", "segmentation_id": 224, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa863c9bd-81", "ovs_interfaceid": "a863c9bd-8117-454c-988f-693c41aea068", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 583.145121] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Releasing lock "refresh_cache-9214a18f-c22d-4e24-980e-7241a2b993bd" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 583.145263] env[60024]: DEBUG nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Instance network_info: |[{"id": "a863c9bd-8117-454c-988f-693c41aea068", "address": "fa:16:3e:bc:e0:68", "network": {"id": "4867f11b-b7b6-404d-a2a0-faa235d1d483", "bridge": "br-int", "label": 
"tempest-FloatingIPsAssociationNegativeTestJSON-335371632-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dc17a2829e0b4b7298ae746628595053", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8ee8640-3787-4c27-9581-962ddb2be7e5", "external-id": "nsx-vlan-transportzone-224", "segmentation_id": 224, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa863c9bd-81", "ovs_interfaceid": "a863c9bd-8117-454c-988f-693c41aea068", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 583.145263] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:bc:e0:68', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c8ee8640-3787-4c27-9581-962ddb2be7e5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a863c9bd-8117-454c-988f-693c41aea068', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 583.153263] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Creating folder: Project (dc17a2829e0b4b7298ae746628595053). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 583.153857] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e91340cc-0b7c-4fc2-86ea-19fbad7961bd {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.166824] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Created folder: Project (dc17a2829e0b4b7298ae746628595053) in parent group-v894073. [ 583.167044] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Creating folder: Instances. Parent ref: group-v894095. 
{{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 583.167340] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a2fe9bd4-0ea3-4561-bf97-b074d051f220 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.178367] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Created folder: Instances in parent group-v894095. [ 583.178749] env[60024]: DEBUG oslo.service.loopingcall [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 583.179096] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 583.179096] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-509487a2-ef84-4b75-bbe9-5999f25d3c05 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.206324] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 583.206324] env[60024]: value = "task-4576223" [ 583.206324] env[60024]: _type = "Task" [ 583.206324] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 583.215638] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576223, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 583.458771] env[60024]: DEBUG nova.network.neutron [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Successfully created port: 3b332be1-bb9d-4c3a-8158-7aa801f480ae {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 583.719514] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576223, 'name': CreateVM_Task, 'duration_secs': 0.333428} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 583.719641] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 583.720574] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 583.720893] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 583.721374] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 583.721676] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6618f02d-32d2-4163-b471-43c6cd7a1248 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.730020] env[60024]: DEBUG oslo_vmware.api [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Waiting for the task: (returnval){ [ 583.730020] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]521e0c4c-69bd-2fd8-71ec-bb79f2d13ede" [ 583.730020] env[60024]: _type = "Task" [ 583.730020] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 583.738386] env[60024]: DEBUG oslo_vmware.api [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]521e0c4c-69bd-2fd8-71ec-bb79f2d13ede, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 583.754515] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Acquiring lock "5888cc9f-7341-4f9c-a93c-dd5ec95f7369" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.754892] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Lock "5888cc9f-7341-4f9c-a93c-dd5ec95f7369" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.243185] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 584.243476] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 584.243702] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 584.259600] env[60024]: DEBUG nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Received event network-vif-plugged-887beb35-0ce2-4f8e-9d80-36cfc959c052 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 584.259986] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Acquiring lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 584.260065] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.260204] env[60024]: DEBUG oslo_concurrency.lockutils 
[req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.260370] env[60024]: DEBUG nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] No waiting events found dispatching network-vif-plugged-887beb35-0ce2-4f8e-9d80-36cfc959c052 {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 584.260528] env[60024]: WARNING nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Received unexpected event network-vif-plugged-887beb35-0ce2-4f8e-9d80-36cfc959c052 for instance with vm_state building and task_state spawning. [ 584.260709] env[60024]: DEBUG nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Received event network-changed-887beb35-0ce2-4f8e-9d80-36cfc959c052 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 584.260902] env[60024]: DEBUG nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Refreshing instance network info cache due to event network-changed-887beb35-0ce2-4f8e-9d80-36cfc959c052. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 584.261063] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Acquiring lock "refresh_cache-37916d26-1b5e-4991-83a2-ca5a5b00c2ac" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 585.113264] env[60024]: DEBUG nova.compute.manager [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Received event network-changed-3f56120b-0dc1-4db9-a8da-94a8fd753130 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 585.113264] env[60024]: DEBUG nova.compute.manager [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Refreshing instance network info cache due to event network-changed-3f56120b-0dc1-4db9-a8da-94a8fd753130. 
{{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 585.113572] env[60024]: DEBUG oslo_concurrency.lockutils [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] Acquiring lock "refresh_cache-363f5261-d589-4f99-b7dd-ab8f16cefee3" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 585.113572] env[60024]: DEBUG oslo_concurrency.lockutils [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] Acquired lock "refresh_cache-363f5261-d589-4f99-b7dd-ab8f16cefee3" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 585.113641] env[60024]: DEBUG nova.network.neutron [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Refreshing network info cache for port 3f56120b-0dc1-4db9-a8da-94a8fd753130 {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 585.270896] env[60024]: DEBUG nova.network.neutron [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Updating instance_info_cache with network_info: [{"id": "887beb35-0ce2-4f8e-9d80-36cfc959c052", "address": "fa:16:3e:5e:d3:6b", "network": {"id": "0f897b67-cdb2-43dd-91fe-7f79bc7b3d46", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67423902", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", "segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap887beb35-0c", "ovs_interfaceid": "887beb35-0ce2-4f8e-9d80-36cfc959c052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ad0f2c95-be2b-4375-b296-5d5d3e581516", "address": "fa:16:3e:fc:2e:78", "network": {"id": "71d2a65c-7b37-48fb-a80c-839fc99a8a26", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2111383669", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d2bf584a-b4a3-4e7a-b0b7-eb8a2bc5a11d", "external-id": "nsx-vlan-transportzone-286", "segmentation_id": 286, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapad0f2c95-be", "ovs_interfaceid": "ad0f2c95-be2b-4375-b296-5d5d3e581516", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b13af17d-15e6-4b21-b707-671f401eb815", "address": "fa:16:3e:6f:25:83", "network": {"id": "0f897b67-cdb2-43dd-91fe-7f79bc7b3d46", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67423902", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", "segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb13af17d-15", "ovs_interfaceid": "b13af17d-15e6-4b21-b707-671f401eb815", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 585.289094] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Releasing lock "refresh_cache-37916d26-1b5e-4991-83a2-ca5a5b00c2ac" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 585.289512] env[60024]: DEBUG nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Instance network_info: |[{"id": "887beb35-0ce2-4f8e-9d80-36cfc959c052", "address": "fa:16:3e:5e:d3:6b", "network": {"id": "0f897b67-cdb2-43dd-91fe-7f79bc7b3d46", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67423902", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", "segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap887beb35-0c", "ovs_interfaceid": "887beb35-0ce2-4f8e-9d80-36cfc959c052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ad0f2c95-be2b-4375-b296-5d5d3e581516", "address": "fa:16:3e:fc:2e:78", "network": {"id": "71d2a65c-7b37-48fb-a80c-839fc99a8a26", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2111383669", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "192.168.129.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d2bf584a-b4a3-4e7a-b0b7-eb8a2bc5a11d", "external-id": "nsx-vlan-transportzone-286", "segmentation_id": 286, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapad0f2c95-be", "ovs_interfaceid": "ad0f2c95-be2b-4375-b296-5d5d3e581516", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b13af17d-15e6-4b21-b707-671f401eb815", "address": "fa:16:3e:6f:25:83", "network": {"id": "0f897b67-cdb2-43dd-91fe-7f79bc7b3d46", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67423902", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", "segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb13af17d-15", "ovs_interfaceid": "b13af17d-15e6-4b21-b707-671f401eb815", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 585.289846] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Acquired lock "refresh_cache-37916d26-1b5e-4991-83a2-ca5a5b00c2ac" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 585.290009] env[60024]: DEBUG nova.network.neutron [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Refreshing network info cache for port 887beb35-0ce2-4f8e-9d80-36cfc959c052 {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 585.295029] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5e:d3:6b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '983826cf-6390-4ec6-bf97-30a1060947fc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '887beb35-0ce2-4f8e-9d80-36cfc959c052', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:fc:2e:78', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd2bf584a-b4a3-4e7a-b0b7-eb8a2bc5a11d', 'network-type': 'nsx.LogicalSwitch', 
'use-external-id': True}, 'iface_id': 'ad0f2c95-be2b-4375-b296-5d5d3e581516', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:6f:25:83', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '983826cf-6390-4ec6-bf97-30a1060947fc', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b13af17d-15e6-4b21-b707-671f401eb815', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 585.305310] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Creating folder: Project (f500e2d8bd8b4db28dc4c1f088d12990). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 585.309552] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-80082ccd-d175-4f15-a0da-b51d85d04958 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.327605] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Created folder: Project (f500e2d8bd8b4db28dc4c1f088d12990) in parent group-v894073. [ 585.328111] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Creating folder: Instances. Parent ref: group-v894098. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 585.328450] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9f5f6d59-69a3-4a03-bed4-5e98c57834db {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.342334] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Created folder: Instances in parent group-v894098. [ 585.342758] env[60024]: DEBUG oslo.service.loopingcall [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 585.343117] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 585.343573] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-52a003da-2f89-484f-950a-cadbe28d2042 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.370680] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 585.370680] env[60024]: value = "task-4576226" [ 585.370680] env[60024]: _type = "Task" [ 585.370680] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 585.382131] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576226, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 585.631202] env[60024]: DEBUG nova.network.neutron [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Updated VIF entry in instance network info cache for port 3f56120b-0dc1-4db9-a8da-94a8fd753130. {{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 585.631624] env[60024]: DEBUG nova.network.neutron [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Updating instance_info_cache with network_info: [{"id": "3f56120b-0dc1-4db9-a8da-94a8fd753130", "address": "fa:16:3e:49:00:ac", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.37", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3f56120b-0d", "ovs_interfaceid": "3f56120b-0dc1-4db9-a8da-94a8fd753130", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 585.646773] env[60024]: DEBUG oslo_concurrency.lockutils [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] Releasing lock "refresh_cache-363f5261-d589-4f99-b7dd-ab8f16cefee3" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 585.646773] env[60024]: DEBUG nova.compute.manager [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Received event network-vif-plugged-a863c9bd-8117-454c-988f-693c41aea068 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 585.646773] env[60024]: DEBUG oslo_concurrency.lockutils [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] Acquiring lock "9214a18f-c22d-4e24-980e-7241a2b993bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 585.646773] env[60024]: DEBUG oslo_concurrency.lockutils [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] Lock "9214a18f-c22d-4e24-980e-7241a2b993bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 585.646773] env[60024]: DEBUG oslo_concurrency.lockutils [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd 
req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] Lock "9214a18f-c22d-4e24-980e-7241a2b993bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 585.646773] env[60024]: DEBUG nova.compute.manager [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] No waiting events found dispatching network-vif-plugged-a863c9bd-8117-454c-988f-693c41aea068 {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 585.647067] env[60024]: WARNING nova.compute.manager [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Received unexpected event network-vif-plugged-a863c9bd-8117-454c-988f-693c41aea068 for instance with vm_state building and task_state spawning. [ 585.647127] env[60024]: DEBUG nova.compute.manager [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Received event network-changed-a863c9bd-8117-454c-988f-693c41aea068 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 585.647240] env[60024]: DEBUG nova.compute.manager [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Refreshing instance network info cache due to event network-changed-a863c9bd-8117-454c-988f-693c41aea068. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 585.647456] env[60024]: DEBUG oslo_concurrency.lockutils [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] Acquiring lock "refresh_cache-9214a18f-c22d-4e24-980e-7241a2b993bd" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 585.647564] env[60024]: DEBUG oslo_concurrency.lockutils [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] Acquired lock "refresh_cache-9214a18f-c22d-4e24-980e-7241a2b993bd" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 585.647725] env[60024]: DEBUG nova.network.neutron [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Refreshing network info cache for port a863c9bd-8117-454c-988f-693c41aea068 {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 585.887151] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576226, 'name': CreateVM_Task, 'duration_secs': 0.440419} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 585.887151] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 585.887424] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 585.887727] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 585.888300] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 585.888791] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5335e32e-96ea-4726-b4b8-56ee61c7d1a7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.897274] env[60024]: DEBUG oslo_vmware.api [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Waiting for the task: (returnval){ [ 585.897274] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52f67973-2380-a602-afba-9c4d6a0bea90" [ 585.897274] env[60024]: _type = "Task" [ 585.897274] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 585.914330] env[60024]: DEBUG oslo_vmware.api [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52f67973-2380-a602-afba-9c4d6a0bea90, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 586.298457] env[60024]: DEBUG nova.network.neutron [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Successfully updated port: 3b332be1-bb9d-4c3a-8158-7aa801f480ae {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 586.310567] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "refresh_cache-3ab1b905-cd6f-4d2b-a244-f85e56f796d3" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 586.310567] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquired lock "refresh_cache-3ab1b905-cd6f-4d2b-a244-f85e56f796d3" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 586.310567] env[60024]: DEBUG nova.network.neutron [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 586.408762] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 586.409215] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 586.409608] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 586.564744] env[60024]: DEBUG nova.network.neutron [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Updated VIF entry in instance network info cache for port a863c9bd-8117-454c-988f-693c41aea068. 
{{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 586.565115] env[60024]: DEBUG nova.network.neutron [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Updating instance_info_cache with network_info: [{"id": "a863c9bd-8117-454c-988f-693c41aea068", "address": "fa:16:3e:bc:e0:68", "network": {"id": "4867f11b-b7b6-404d-a2a0-faa235d1d483", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationNegativeTestJSON-335371632-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dc17a2829e0b4b7298ae746628595053", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8ee8640-3787-4c27-9581-962ddb2be7e5", "external-id": "nsx-vlan-transportzone-224", "segmentation_id": 224, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa863c9bd-81", "ovs_interfaceid": "a863c9bd-8117-454c-988f-693c41aea068", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 586.576193] env[60024]: DEBUG oslo_concurrency.lockutils [req-6f23e1d1-87a2-4d20-92e5-5bcf1100bdbd req-3b94ccb2-d631-4e81-8c87-a50af9e08661 service nova] Releasing lock "refresh_cache-9214a18f-c22d-4e24-980e-7241a2b993bd" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 586.628269] env[60024]: DEBUG nova.network.neutron [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Updated VIF entry in instance network info cache for port 887beb35-0ce2-4f8e-9d80-36cfc959c052. 
{{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 586.628753] env[60024]: DEBUG nova.network.neutron [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Updating instance_info_cache with network_info: [{"id": "887beb35-0ce2-4f8e-9d80-36cfc959c052", "address": "fa:16:3e:5e:d3:6b", "network": {"id": "0f897b67-cdb2-43dd-91fe-7f79bc7b3d46", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67423902", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", "segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap887beb35-0c", "ovs_interfaceid": "887beb35-0ce2-4f8e-9d80-36cfc959c052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ad0f2c95-be2b-4375-b296-5d5d3e581516", "address": "fa:16:3e:fc:2e:78", "network": {"id": "71d2a65c-7b37-48fb-a80c-839fc99a8a26", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2111383669", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d2bf584a-b4a3-4e7a-b0b7-eb8a2bc5a11d", "external-id": "nsx-vlan-transportzone-286", "segmentation_id": 286, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapad0f2c95-be", "ovs_interfaceid": "ad0f2c95-be2b-4375-b296-5d5d3e581516", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b13af17d-15e6-4b21-b707-671f401eb815", "address": "fa:16:3e:6f:25:83", "network": {"id": "0f897b67-cdb2-43dd-91fe-7f79bc7b3d46", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67423902", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", 
"segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb13af17d-15", "ovs_interfaceid": "b13af17d-15e6-4b21-b707-671f401eb815", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 586.643152] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Releasing lock "refresh_cache-37916d26-1b5e-4991-83a2-ca5a5b00c2ac" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 586.643517] env[60024]: DEBUG nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Received event network-vif-plugged-3e6e8d19-b28b-4fe5-bb41-b3cada986c2f {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 586.644037] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Acquiring lock "ce222a29-3611-45b3-9664-87ae2fb1b1b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.644589] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Lock "ce222a29-3611-45b3-9664-87ae2fb1b1b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.644915] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Lock "ce222a29-3611-45b3-9664-87ae2fb1b1b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.645160] env[60024]: DEBUG nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] No waiting events found dispatching network-vif-plugged-3e6e8d19-b28b-4fe5-bb41-b3cada986c2f {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 586.645348] env[60024]: WARNING nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Received unexpected event network-vif-plugged-3e6e8d19-b28b-4fe5-bb41-b3cada986c2f for instance with vm_state building and task_state spawning. 
[ 586.645776] env[60024]: DEBUG nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Received event network-vif-plugged-ad0f2c95-be2b-4375-b296-5d5d3e581516 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 586.646105] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Acquiring lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.646339] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.646530] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.646743] env[60024]: DEBUG nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] No waiting events found dispatching network-vif-plugged-ad0f2c95-be2b-4375-b296-5d5d3e581516 {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 586.647103] env[60024]: WARNING nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Received unexpected event network-vif-plugged-ad0f2c95-be2b-4375-b296-5d5d3e581516 for instance with vm_state building and task_state spawning. [ 586.647298] env[60024]: DEBUG nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Received event network-changed-3e6e8d19-b28b-4fe5-bb41-b3cada986c2f {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 586.647459] env[60024]: DEBUG nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Refreshing instance network info cache due to event network-changed-3e6e8d19-b28b-4fe5-bb41-b3cada986c2f. 
{{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 586.647641] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Acquiring lock "refresh_cache-ce222a29-3611-45b3-9664-87ae2fb1b1b8" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 586.647778] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Acquired lock "refresh_cache-ce222a29-3611-45b3-9664-87ae2fb1b1b8" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 586.647934] env[60024]: DEBUG nova.network.neutron [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Refreshing network info cache for port 3e6e8d19-b28b-4fe5-bb41-b3cada986c2f {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 586.651175] env[60024]: DEBUG nova.network.neutron [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 587.752806] env[60024]: DEBUG nova.network.neutron [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Updating instance_info_cache with network_info: [{"id": "3b332be1-bb9d-4c3a-8158-7aa801f480ae", "address": "fa:16:3e:ce:cc:6b", "network": {"id": "271e6595-cea8-4029-8fab-8f17d8eac2ed", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-118947530-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "730d218125d1484687ab1b68a1e73d2e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8459aaf-d6a8-46fb-ad14-464ac3104695", "external-id": "nsx-vlan-transportzone-46", "segmentation_id": 46, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3b332be1-bb", "ovs_interfaceid": "3b332be1-bb9d-4c3a-8158-7aa801f480ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 587.764569] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Releasing lock "refresh_cache-3ab1b905-cd6f-4d2b-a244-f85e56f796d3" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 587.765215] env[60024]: DEBUG nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 
tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Instance network_info: |[{"id": "3b332be1-bb9d-4c3a-8158-7aa801f480ae", "address": "fa:16:3e:ce:cc:6b", "network": {"id": "271e6595-cea8-4029-8fab-8f17d8eac2ed", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-118947530-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "730d218125d1484687ab1b68a1e73d2e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8459aaf-d6a8-46fb-ad14-464ac3104695", "external-id": "nsx-vlan-transportzone-46", "segmentation_id": 46, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3b332be1-bb", "ovs_interfaceid": "3b332be1-bb9d-4c3a-8158-7aa801f480ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 587.766101] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ce:cc:6b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c8459aaf-d6a8-46fb-ad14-464ac3104695', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3b332be1-bb9d-4c3a-8158-7aa801f480ae', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 587.774949] env[60024]: DEBUG oslo.service.loopingcall [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 587.775659] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 587.779016] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-944c3e20-43b9-4c1d-ad06-2b70ffa65855 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 587.797554] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 587.797554] env[60024]: value = "task-4576227" [ 587.797554] env[60024]: _type = "Task" [ 587.797554] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 587.815996] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576227, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 587.949975] env[60024]: DEBUG nova.network.neutron [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Updated VIF entry in instance network info cache for port 3e6e8d19-b28b-4fe5-bb41-b3cada986c2f. {{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 587.949975] env[60024]: DEBUG nova.network.neutron [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Updating instance_info_cache with network_info: [{"id": "3e6e8d19-b28b-4fe5-bb41-b3cada986c2f", "address": "fa:16:3e:68:5a:7f", "network": {"id": "271e6595-cea8-4029-8fab-8f17d8eac2ed", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-118947530-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "730d218125d1484687ab1b68a1e73d2e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8459aaf-d6a8-46fb-ad14-464ac3104695", "external-id": "nsx-vlan-transportzone-46", "segmentation_id": 46, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3e6e8d19-b2", "ovs_interfaceid": "3e6e8d19-b28b-4fe5-bb41-b3cada986c2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 587.968043] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Releasing lock "refresh_cache-ce222a29-3611-45b3-9664-87ae2fb1b1b8" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 587.968043] env[60024]: DEBUG nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Received event network-changed-ad0f2c95-be2b-4375-b296-5d5d3e581516 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 587.968043] env[60024]: DEBUG nova.compute.manager [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Refreshing instance network info cache due to event network-changed-ad0f2c95-be2b-4375-b296-5d5d3e581516. 
{{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 587.968043] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Acquiring lock "refresh_cache-37916d26-1b5e-4991-83a2-ca5a5b00c2ac" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 587.968043] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Acquired lock "refresh_cache-37916d26-1b5e-4991-83a2-ca5a5b00c2ac" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 587.968043] env[60024]: DEBUG nova.network.neutron [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Refreshing network info cache for port ad0f2c95-be2b-4375-b296-5d5d3e581516 {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 588.314749] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576227, 'name': CreateVM_Task, 'duration_secs': 0.348719} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 588.314971] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 588.315717] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 588.319249] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 588.320366] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 588.320366] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ed36cd09-9c3b-4774-94c4-831c4b30b261 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.329284] env[60024]: DEBUG oslo_vmware.api [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Waiting for the task: (returnval){ [ 588.329284] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]529c1850-c21d-9cc6-c370-6b5ed6fb8dc2" [ 588.329284] env[60024]: _type = "Task" [ 588.329284] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 588.340654] env[60024]: DEBUG oslo_vmware.api [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]529c1850-c21d-9cc6-c370-6b5ed6fb8dc2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 588.531204] env[60024]: DEBUG nova.network.neutron [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Updated VIF entry in instance network info cache for port ad0f2c95-be2b-4375-b296-5d5d3e581516. {{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 588.531683] env[60024]: DEBUG nova.network.neutron [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Updating instance_info_cache with network_info: [{"id": "887beb35-0ce2-4f8e-9d80-36cfc959c052", "address": "fa:16:3e:5e:d3:6b", "network": {"id": "0f897b67-cdb2-43dd-91fe-7f79bc7b3d46", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67423902", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", "segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap887beb35-0c", "ovs_interfaceid": "887beb35-0ce2-4f8e-9d80-36cfc959c052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ad0f2c95-be2b-4375-b296-5d5d3e581516", "address": "fa:16:3e:fc:2e:78", "network": {"id": "71d2a65c-7b37-48fb-a80c-839fc99a8a26", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2111383669", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d2bf584a-b4a3-4e7a-b0b7-eb8a2bc5a11d", "external-id": "nsx-vlan-transportzone-286", "segmentation_id": 286, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapad0f2c95-be", "ovs_interfaceid": "ad0f2c95-be2b-4375-b296-5d5d3e581516", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b13af17d-15e6-4b21-b707-671f401eb815", "address": 
"fa:16:3e:6f:25:83", "network": {"id": "0f897b67-cdb2-43dd-91fe-7f79bc7b3d46", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67423902", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", "segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb13af17d-15", "ovs_interfaceid": "b13af17d-15e6-4b21-b707-671f401eb815", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 588.547782] env[60024]: DEBUG oslo_concurrency.lockutils [req-0da9fdf2-8764-4662-8b4c-753441f94a72 req-1ace2f8d-e3cf-4576-a097-512f7c7a7978 service nova] Releasing lock "refresh_cache-37916d26-1b5e-4991-83a2-ca5a5b00c2ac" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 588.630322] env[60024]: DEBUG nova.compute.manager [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Received event network-vif-plugged-b13af17d-15e6-4b21-b707-671f401eb815 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 588.630583] env[60024]: DEBUG oslo_concurrency.lockutils [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] Acquiring lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.630859] env[60024]: DEBUG oslo_concurrency.lockutils [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] Lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.631085] env[60024]: DEBUG oslo_concurrency.lockutils [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] Lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.631315] env[60024]: DEBUG nova.compute.manager [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] No waiting events found dispatching network-vif-plugged-b13af17d-15e6-4b21-b707-671f401eb815 {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 588.631513] env[60024]: WARNING nova.compute.manager [req-843bc6b6-2432-4b43-8e59-59a495a8e600 
req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Received unexpected event network-vif-plugged-b13af17d-15e6-4b21-b707-671f401eb815 for instance with vm_state building and task_state spawning. [ 588.636075] env[60024]: DEBUG nova.compute.manager [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Received event network-changed-b13af17d-15e6-4b21-b707-671f401eb815 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 588.636075] env[60024]: DEBUG nova.compute.manager [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Refreshing instance network info cache due to event network-changed-b13af17d-15e6-4b21-b707-671f401eb815. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 588.636075] env[60024]: DEBUG oslo_concurrency.lockutils [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] Acquiring lock "refresh_cache-37916d26-1b5e-4991-83a2-ca5a5b00c2ac" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 588.636075] env[60024]: DEBUG oslo_concurrency.lockutils [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] Acquired lock "refresh_cache-37916d26-1b5e-4991-83a2-ca5a5b00c2ac" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 588.636075] env[60024]: DEBUG nova.network.neutron [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Refreshing network info cache for port b13af17d-15e6-4b21-b707-671f401eb815 {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 588.846731] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 588.847034] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 588.847345] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 589.342152] env[60024]: DEBUG nova.network.neutron [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Updated VIF entry in instance network info cache for port b13af17d-15e6-4b21-b707-671f401eb815. 
{{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 589.342645] env[60024]: DEBUG nova.network.neutron [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Updating instance_info_cache with network_info: [{"id": "887beb35-0ce2-4f8e-9d80-36cfc959c052", "address": "fa:16:3e:5e:d3:6b", "network": {"id": "0f897b67-cdb2-43dd-91fe-7f79bc7b3d46", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67423902", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.60", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", "segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap887beb35-0c", "ovs_interfaceid": "887beb35-0ce2-4f8e-9d80-36cfc959c052", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ad0f2c95-be2b-4375-b296-5d5d3e581516", "address": "fa:16:3e:fc:2e:78", "network": {"id": "71d2a65c-7b37-48fb-a80c-839fc99a8a26", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2111383669", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.224", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d2bf584a-b4a3-4e7a-b0b7-eb8a2bc5a11d", "external-id": "nsx-vlan-transportzone-286", "segmentation_id": 286, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapad0f2c95-be", "ovs_interfaceid": "ad0f2c95-be2b-4375-b296-5d5d3e581516", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b13af17d-15e6-4b21-b707-671f401eb815", "address": "fa:16:3e:6f:25:83", "network": {"id": "0f897b67-cdb2-43dd-91fe-7f79bc7b3d46", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-67423902", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.116", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f500e2d8bd8b4db28dc4c1f088d12990", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "983826cf-6390-4ec6-bf97-30a1060947fc", "external-id": "nsx-vlan-transportzone-367", 
"segmentation_id": 367, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb13af17d-15", "ovs_interfaceid": "b13af17d-15e6-4b21-b707-671f401eb815", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 589.360963] env[60024]: DEBUG oslo_concurrency.lockutils [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] Releasing lock "refresh_cache-37916d26-1b5e-4991-83a2-ca5a5b00c2ac" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 589.360963] env[60024]: DEBUG nova.compute.manager [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Received event network-vif-plugged-3b332be1-bb9d-4c3a-8158-7aa801f480ae {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 589.360963] env[60024]: DEBUG oslo_concurrency.lockutils [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] Acquiring lock "3ab1b905-cd6f-4d2b-a244-f85e56f796d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 589.360963] env[60024]: DEBUG oslo_concurrency.lockutils [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] Lock "3ab1b905-cd6f-4d2b-a244-f85e56f796d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 589.360963] env[60024]: DEBUG oslo_concurrency.lockutils [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] Lock "3ab1b905-cd6f-4d2b-a244-f85e56f796d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 589.360963] env[60024]: DEBUG nova.compute.manager [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] No waiting events found dispatching network-vif-plugged-3b332be1-bb9d-4c3a-8158-7aa801f480ae {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 589.360963] env[60024]: WARNING nova.compute.manager [req-843bc6b6-2432-4b43-8e59-59a495a8e600 req-ca6b65d5-a6b8-40e4-8c17-67c410862d33 service nova] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Received unexpected event network-vif-plugged-3b332be1-bb9d-4c3a-8158-7aa801f480ae for instance with vm_state building and task_state spawning. 
[ 592.448993] env[60024]: DEBUG nova.compute.manager [req-01878a2c-3ae1-43b8-8f29-da4366066fe6 req-7b09955b-8569-4dcf-abef-7aecbdb5d987 service nova] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Received event network-changed-3b332be1-bb9d-4c3a-8158-7aa801f480ae {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 592.448993] env[60024]: DEBUG nova.compute.manager [req-01878a2c-3ae1-43b8-8f29-da4366066fe6 req-7b09955b-8569-4dcf-abef-7aecbdb5d987 service nova] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Refreshing instance network info cache due to event network-changed-3b332be1-bb9d-4c3a-8158-7aa801f480ae. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 592.449666] env[60024]: DEBUG oslo_concurrency.lockutils [req-01878a2c-3ae1-43b8-8f29-da4366066fe6 req-7b09955b-8569-4dcf-abef-7aecbdb5d987 service nova] Acquiring lock "refresh_cache-3ab1b905-cd6f-4d2b-a244-f85e56f796d3" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 592.449666] env[60024]: DEBUG oslo_concurrency.lockutils [req-01878a2c-3ae1-43b8-8f29-da4366066fe6 req-7b09955b-8569-4dcf-abef-7aecbdb5d987 service nova] Acquired lock "refresh_cache-3ab1b905-cd6f-4d2b-a244-f85e56f796d3" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 592.449666] env[60024]: DEBUG nova.network.neutron [req-01878a2c-3ae1-43b8-8f29-da4366066fe6 req-7b09955b-8569-4dcf-abef-7aecbdb5d987 service nova] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Refreshing network info cache for port 3b332be1-bb9d-4c3a-8158-7aa801f480ae {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 592.836728] env[60024]: DEBUG nova.network.neutron [req-01878a2c-3ae1-43b8-8f29-da4366066fe6 req-7b09955b-8569-4dcf-abef-7aecbdb5d987 service nova] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Updated VIF entry in instance network info cache for port 3b332be1-bb9d-4c3a-8158-7aa801f480ae. 
{{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 592.837162] env[60024]: DEBUG nova.network.neutron [req-01878a2c-3ae1-43b8-8f29-da4366066fe6 req-7b09955b-8569-4dcf-abef-7aecbdb5d987 service nova] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Updating instance_info_cache with network_info: [{"id": "3b332be1-bb9d-4c3a-8158-7aa801f480ae", "address": "fa:16:3e:ce:cc:6b", "network": {"id": "271e6595-cea8-4029-8fab-8f17d8eac2ed", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-118947530-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "730d218125d1484687ab1b68a1e73d2e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8459aaf-d6a8-46fb-ad14-464ac3104695", "external-id": "nsx-vlan-transportzone-46", "segmentation_id": 46, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3b332be1-bb", "ovs_interfaceid": "3b332be1-bb9d-4c3a-8158-7aa801f480ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 592.851489] env[60024]: DEBUG oslo_concurrency.lockutils [req-01878a2c-3ae1-43b8-8f29-da4366066fe6 req-7b09955b-8569-4dcf-abef-7aecbdb5d987 service nova] Releasing lock "refresh_cache-3ab1b905-cd6f-4d2b-a244-f85e56f796d3" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.233966] env[60024]: WARNING oslo_vmware.rw_handles [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 606.233966] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 606.233966] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 606.233966] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 606.233966] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 606.233966] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 606.233966] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 606.233966] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 606.233966] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 606.233966] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 606.233966] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 606.233966] env[60024]: ERROR oslo_vmware.rw_handles [ 606.234661] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 
tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to vmware_temp/4aa2014d-9089-4073-95bc-66da8f5f98e6/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 606.235779] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 606.236033] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Copying Virtual Disk [datastore2] vmware_temp/4aa2014d-9089-4073-95bc-66da8f5f98e6/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/4aa2014d-9089-4073-95bc-66da8f5f98e6/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 606.236384] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-69aa609e-6a37-4409-b9fd-14b7b03accb4 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.247862] env[60024]: DEBUG oslo_vmware.api [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Waiting for the task: (returnval){ [ 606.247862] env[60024]: value = "task-4576228" [ 606.247862] env[60024]: _type = "Task" [ 606.247862] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.764560] env[60024]: DEBUG oslo_vmware.exceptions [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Fault InvalidArgument not matched. 
{{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 606.764847] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.770741] env[60024]: ERROR nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 606.770741] env[60024]: Faults: ['InvalidArgument'] [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Traceback (most recent call last): [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] yield resources [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] self.driver.spawn(context, instance, image_meta, [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] self._fetch_image_if_missing(context, vi) [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] image_cache(vi, tmp_image_ds_loc) [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] vm_util.copy_virtual_disk( [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] session._wait_for_task(vmdk_copy_task) [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] return self.wait_for_task(task_ref) [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] return evt.wait() [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] result = hub.switch() [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] return self.greenlet.switch() [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] self.f(*self.args, **self.kw) [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] raise exceptions.translate_fault(task_info.error) [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Faults: ['InvalidArgument'] [ 606.770741] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] [ 606.777234] env[60024]: INFO nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Terminating instance [ 606.777234] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 606.777234] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 606.779792] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Acquiring lock "refresh_cache-15e44d1f-ae9b-4ff7-841c-90acc81cf38b" {{(pid=60024) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 606.779792] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Acquired lock "refresh_cache-15e44d1f-ae9b-4ff7-841c-90acc81cf38b" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 606.779792] env[60024]: DEBUG nova.network.neutron [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 606.781491] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1e92ba67-7e9f-4a5c-8729-cf7d84ab97e0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.797552] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 606.797759] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 606.807570] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6ed1c58b-3bc8-49bb-bc9d-1e61b40f0a5a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.820043] env[60024]: DEBUG oslo_vmware.api [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Waiting for the task: (returnval){ [ 606.820043] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]5280cf6e-5d9a-b4b6-e5e9-0f89a5e31e95" [ 606.820043] env[60024]: _type = "Task" [ 606.820043] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.839073] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 606.839310] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Creating directory with path [datastore2] vmware_temp/3612249c-337b-4cdd-a5fb-e91277991d67/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 606.839563] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b11c4333-6a6b-4d04-999d-6ef91b97508f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.868441] env[60024]: DEBUG nova.network.neutron [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 606.878945] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Created directory with path [datastore2] vmware_temp/3612249c-337b-4cdd-a5fb-e91277991d67/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 606.879209] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Fetch image to [datastore2] vmware_temp/3612249c-337b-4cdd-a5fb-e91277991d67/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 606.879407] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/3612249c-337b-4cdd-a5fb-e91277991d67/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 606.880676] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96a97d3d-b02e-4e49-afc9-c51d6ca37f82 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.891636] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-700c178b-a385-4687-b643-0a8e41c27982 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.905509] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cae9e5fd-f001-441e-841c-2a6395cea261 {{(pid=60024) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.951631] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14101b33-04f4-4e4f-ab29-61448f866e46 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.959758] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ac626f26-7d9b-41bf-a7bb-18abd187c984 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.997038] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 607.084839] env[60024]: DEBUG oslo_vmware.rw_handles [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3612249c-337b-4cdd-a5fb-e91277991d67/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 607.155108] env[60024]: DEBUG oslo_vmware.rw_handles [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Completed reading data from the image iterator. {{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 607.155223] env[60024]: DEBUG oslo_vmware.rw_handles [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3612249c-337b-4cdd-a5fb-e91277991d67/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 607.280999] env[60024]: DEBUG nova.network.neutron [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 607.292015] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Releasing lock "refresh_cache-15e44d1f-ae9b-4ff7-841c-90acc81cf38b" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 607.293105] env[60024]: DEBUG nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 607.293105] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 607.295124] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fd7973f-25ff-4f2f-80af-53221ad64a8a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.305453] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 607.305453] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c37612b4-e049-4c04-8d9d-6984248f3d96 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.340739] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 607.340739] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 607.340739] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Deleting the datastore file [datastore2] 15e44d1f-ae9b-4ff7-841c-90acc81cf38b {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 607.340739] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-86d013d3-31ae-4367-ae32-9d502f4b838d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.354100] env[60024]: DEBUG oslo_vmware.api [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Waiting for the task: (returnval){ [ 607.354100] env[60024]: value = "task-4576230" [ 607.354100] env[60024]: _type = "Task" [ 607.354100] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 607.367758] env[60024]: DEBUG oslo_vmware.api [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Task: {'id': task-4576230, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 607.867772] env[60024]: DEBUG oslo_vmware.api [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Task: {'id': task-4576230, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.050032} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 607.868049] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 607.868232] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 607.868403] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 607.868812] env[60024]: INFO nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Took 0.58 seconds to destroy the instance on the hypervisor. [ 607.869103] env[60024]: DEBUG oslo.service.loopingcall [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 607.869305] env[60024]: DEBUG nova.compute.manager [-] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 607.874241] env[60024]: DEBUG nova.compute.claims [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 607.874241] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 607.874241] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.142888] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-464f4306-a326-420e-8d79-8a9548726d50 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.156680] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc142e47-dcb6-4150-a946-d7841a203b6b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.198585] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f0e94b3-0222-4dda-90c0-5c078a963aa8 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.211169] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9cbfb7e-1044-4a93-92bc-59acb9d5c2eb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 608.231094] env[60024]: DEBUG nova.compute.provider_tree [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 608.243234] env[60024]: DEBUG nova.scheduler.client.report [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 608.258651] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 
tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.385s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.259243] env[60024]: ERROR nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 608.259243] env[60024]: Faults: ['InvalidArgument'] [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Traceback (most recent call last): [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] self.driver.spawn(context, instance, image_meta, [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] self._fetch_image_if_missing(context, vi) [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] image_cache(vi, tmp_image_ds_loc) [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] vm_util.copy_virtual_disk( [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] session._wait_for_task(vmdk_copy_task) [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] return self.wait_for_task(task_ref) [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] return evt.wait() [ 608.259243] env[60024]: ERROR 
nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] result = hub.switch() [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] return self.greenlet.switch() [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] self.f(*self.args, **self.kw) [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] raise exceptions.translate_fault(task_info.error) [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Faults: ['InvalidArgument'] [ 608.259243] env[60024]: ERROR nova.compute.manager [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] [ 608.260175] env[60024]: DEBUG nova.compute.utils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] VimFaultException {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 608.262966] env[60024]: DEBUG nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Build of instance 15e44d1f-ae9b-4ff7-841c-90acc81cf38b was re-scheduled: A specified parameter was not correct: fileType [ 608.262966] env[60024]: Faults: ['InvalidArgument'] {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 608.263462] env[60024]: DEBUG nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 608.263738] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Acquiring lock "refresh_cache-15e44d1f-ae9b-4ff7-841c-90acc81cf38b" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 608.263918] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Acquired lock 
"refresh_cache-15e44d1f-ae9b-4ff7-841c-90acc81cf38b" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 608.264144] env[60024]: DEBUG nova.network.neutron [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 608.346443] env[60024]: DEBUG nova.network.neutron [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 608.464285] env[60024]: DEBUG nova.network.neutron [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 608.476432] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Releasing lock "refresh_cache-15e44d1f-ae9b-4ff7-841c-90acc81cf38b" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 608.476665] env[60024]: DEBUG nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 608.476850] env[60024]: DEBUG nova.compute.manager [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] [instance: 15e44d1f-ae9b-4ff7-841c-90acc81cf38b] Skipping network deallocation for instance since networking was not requested. {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 608.614742] env[60024]: INFO nova.scheduler.client.report [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Deleted allocations for instance 15e44d1f-ae9b-4ff7-841c-90acc81cf38b [ 608.641860] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6dd9723a-6ac3-4c93-9836-da2fae0413c3 tempest-ServerDiagnosticsV248Test-927759409 tempest-ServerDiagnosticsV248Test-927759409-project-member] Lock "15e44d1f-ae9b-4ff7-841c-90acc81cf38b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 54.039s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 608.685947] env[60024]: DEBUG nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Starting instance... 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 608.770700] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 608.771097] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 608.772821] env[60024]: INFO nova.compute.claims [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 609.048231] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f923478-6101-4137-a134-844a39c1ff97 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.058538] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d09a51f-e752-405f-a601-30752274657e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.096207] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7578414-e8e9-4993-a939-5a7712d45452 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.106975] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cdde9da-084a-4c23-b131-a3e21ba8150a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.122405] env[60024]: DEBUG nova.compute.provider_tree [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 609.131808] env[60024]: DEBUG nova.scheduler.client.report [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 609.152354] env[60024]: DEBUG oslo_concurrency.lockutils [None 
req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.381s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 609.152842] env[60024]: DEBUG nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 609.204457] env[60024]: DEBUG nova.compute.utils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 609.205757] env[60024]: DEBUG nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 609.205924] env[60024]: DEBUG nova.network.neutron [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 609.222057] env[60024]: DEBUG nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 609.304949] env[60024]: DEBUG nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 609.329706] env[60024]: DEBUG nova.virt.hardware [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 609.330013] env[60024]: DEBUG nova.virt.hardware [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 609.330272] env[60024]: DEBUG nova.virt.hardware [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 609.330510] env[60024]: DEBUG nova.virt.hardware [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 609.330672] env[60024]: DEBUG nova.virt.hardware [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 609.330949] env[60024]: DEBUG nova.virt.hardware [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 609.331081] env[60024]: DEBUG nova.virt.hardware [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 609.331259] env[60024]: DEBUG nova.virt.hardware [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 609.331431] env[60024]: DEBUG nova.virt.hardware [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 609.331601] env[60024]: DEBUG nova.virt.hardware [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 609.331775] env[60024]: DEBUG nova.virt.hardware [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 609.332686] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3b0a25f-cac3-4641-a077-39f1dd057d5d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.343471] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfcabad0-164a-4f1c-874f-8b81d8e0729c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.599851] env[60024]: DEBUG nova.policy [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aee8538ca1c447babd57e5ee96214faf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6bbeaa0167e147da8a6002a03f3ed43d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 611.265460] env[60024]: DEBUG nova.network.neutron [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Successfully created port: 76671be8-5210-460f-bbee-4b6e5f9d6221 {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 614.113231] env[60024]: DEBUG nova.network.neutron [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Successfully updated port: 76671be8-5210-460f-bbee-4b6e5f9d6221 {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 614.126355] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Acquiring lock "refresh_cache-036d6de2-f69b-4714-b89e-9c4307253675" {{(pid=60024) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 614.126355] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Acquired lock "refresh_cache-036d6de2-f69b-4714-b89e-9c4307253675" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 614.126355] env[60024]: DEBUG nova.network.neutron [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 614.494553] env[60024]: DEBUG nova.network.neutron [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 615.141403] env[60024]: DEBUG nova.network.neutron [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Updating instance_info_cache with network_info: [{"id": "76671be8-5210-460f-bbee-4b6e5f9d6221", "address": "fa:16:3e:b9:41:9c", "network": {"id": "065508a2-33cf-4962-837b-61c5c7fb491f", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-63838272-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6bbeaa0167e147da8a6002a03f3ed43d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a55f45a-d631-4ebc-b73b-8a30bd0a32a8", "external-id": "nsx-vlan-transportzone-303", "segmentation_id": 303, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap76671be8-52", "ovs_interfaceid": "76671be8-5210-460f-bbee-4b6e5f9d6221", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 615.161485] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Releasing lock "refresh_cache-036d6de2-f69b-4714-b89e-9c4307253675" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 615.161786] env[60024]: DEBUG nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Instance network_info: |[{"id": "76671be8-5210-460f-bbee-4b6e5f9d6221", "address": 
"fa:16:3e:b9:41:9c", "network": {"id": "065508a2-33cf-4962-837b-61c5c7fb491f", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-63838272-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6bbeaa0167e147da8a6002a03f3ed43d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a55f45a-d631-4ebc-b73b-8a30bd0a32a8", "external-id": "nsx-vlan-transportzone-303", "segmentation_id": 303, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap76671be8-52", "ovs_interfaceid": "76671be8-5210-460f-bbee-4b6e5f9d6221", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 615.162222] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b9:41:9c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1a55f45a-d631-4ebc-b73b-8a30bd0a32a8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '76671be8-5210-460f-bbee-4b6e5f9d6221', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 615.169945] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Creating folder: Project (6bbeaa0167e147da8a6002a03f3ed43d). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 615.170598] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a62eafc4-6a7d-4d0d-b6d1-09e2319c4678 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.191909] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Created folder: Project (6bbeaa0167e147da8a6002a03f3ed43d) in parent group-v894073. [ 615.191909] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Creating folder: Instances. Parent ref: group-v894102. 
{{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 615.191909] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6480b89e-9c14-4572-8740-24d8e3539d19 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.206907] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Created folder: Instances in parent group-v894102. [ 615.210024] env[60024]: DEBUG oslo.service.loopingcall [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 615.210024] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 615.210024] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ad64cb22-c1ac-4856-8d99-002e94e13f03 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.233823] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 615.233823] env[60024]: value = "task-4576233" [ 615.233823] env[60024]: _type = "Task" [ 615.233823] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 615.244761] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576233, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 615.749799] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576233, 'name': CreateVM_Task, 'duration_secs': 0.373799} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 615.749799] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 615.750249] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 615.750418] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 615.750748] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 615.751038] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3389d470-aa6f-4f10-820f-ade09f7ec37f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.759838] env[60024]: DEBUG oslo_vmware.api [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Waiting for the task: (returnval){ [ 615.759838] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52500412-3d32-d7e4-8588-a10b74755a8c" [ 615.759838] env[60024]: _type = "Task" [ 615.759838] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 615.772070] env[60024]: DEBUG oslo_vmware.api [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52500412-3d32-d7e4-8588-a10b74755a8c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 616.279193] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 616.279193] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 616.279193] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 618.157987] env[60024]: DEBUG nova.compute.manager [req-4032c699-3e71-402a-a6b7-77d96b4cad3a req-fb0316ea-b35a-49f2-98ac-27f8401af1df service nova] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Received event network-vif-plugged-76671be8-5210-460f-bbee-4b6e5f9d6221 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 618.158439] env[60024]: DEBUG oslo_concurrency.lockutils [req-4032c699-3e71-402a-a6b7-77d96b4cad3a req-fb0316ea-b35a-49f2-98ac-27f8401af1df service nova] Acquiring lock "036d6de2-f69b-4714-b89e-9c4307253675-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.158439] env[60024]: DEBUG oslo_concurrency.lockutils [req-4032c699-3e71-402a-a6b7-77d96b4cad3a req-fb0316ea-b35a-49f2-98ac-27f8401af1df service nova] Lock "036d6de2-f69b-4714-b89e-9c4307253675-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.158691] env[60024]: DEBUG oslo_concurrency.lockutils [req-4032c699-3e71-402a-a6b7-77d96b4cad3a req-fb0316ea-b35a-49f2-98ac-27f8401af1df service nova] Lock "036d6de2-f69b-4714-b89e-9c4307253675-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 618.158910] env[60024]: DEBUG nova.compute.manager [req-4032c699-3e71-402a-a6b7-77d96b4cad3a req-fb0316ea-b35a-49f2-98ac-27f8401af1df service nova] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] No waiting events found dispatching network-vif-plugged-76671be8-5210-460f-bbee-4b6e5f9d6221 {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 618.159092] env[60024]: WARNING nova.compute.manager [req-4032c699-3e71-402a-a6b7-77d96b4cad3a req-fb0316ea-b35a-49f2-98ac-27f8401af1df service nova] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Received unexpected event network-vif-plugged-76671be8-5210-460f-bbee-4b6e5f9d6221 
for instance with vm_state building and task_state spawning. [ 622.573824] env[60024]: DEBUG nova.compute.manager [req-c89c1339-df8b-48b1-90b1-fb33d73e97af req-f5a2fe44-19c8-4eef-9111-b6132c01ae9a service nova] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Received event network-changed-76671be8-5210-460f-bbee-4b6e5f9d6221 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 622.574089] env[60024]: DEBUG nova.compute.manager [req-c89c1339-df8b-48b1-90b1-fb33d73e97af req-f5a2fe44-19c8-4eef-9111-b6132c01ae9a service nova] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Refreshing instance network info cache due to event network-changed-76671be8-5210-460f-bbee-4b6e5f9d6221. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 622.574089] env[60024]: DEBUG oslo_concurrency.lockutils [req-c89c1339-df8b-48b1-90b1-fb33d73e97af req-f5a2fe44-19c8-4eef-9111-b6132c01ae9a service nova] Acquiring lock "refresh_cache-036d6de2-f69b-4714-b89e-9c4307253675" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 622.574289] env[60024]: DEBUG oslo_concurrency.lockutils [req-c89c1339-df8b-48b1-90b1-fb33d73e97af req-f5a2fe44-19c8-4eef-9111-b6132c01ae9a service nova] Acquired lock "refresh_cache-036d6de2-f69b-4714-b89e-9c4307253675" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 622.575026] env[60024]: DEBUG nova.network.neutron [req-c89c1339-df8b-48b1-90b1-fb33d73e97af req-f5a2fe44-19c8-4eef-9111-b6132c01ae9a service nova] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Refreshing network info cache for port 76671be8-5210-460f-bbee-4b6e5f9d6221 {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 623.302084] env[60024]: DEBUG nova.network.neutron [req-c89c1339-df8b-48b1-90b1-fb33d73e97af req-f5a2fe44-19c8-4eef-9111-b6132c01ae9a service nova] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Updated VIF entry in instance network info cache for port 76671be8-5210-460f-bbee-4b6e5f9d6221. 
{{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 623.302520] env[60024]: DEBUG nova.network.neutron [req-c89c1339-df8b-48b1-90b1-fb33d73e97af req-f5a2fe44-19c8-4eef-9111-b6132c01ae9a service nova] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Updating instance_info_cache with network_info: [{"id": "76671be8-5210-460f-bbee-4b6e5f9d6221", "address": "fa:16:3e:b9:41:9c", "network": {"id": "065508a2-33cf-4962-837b-61c5c7fb491f", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-63838272-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6bbeaa0167e147da8a6002a03f3ed43d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a55f45a-d631-4ebc-b73b-8a30bd0a32a8", "external-id": "nsx-vlan-transportzone-303", "segmentation_id": 303, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap76671be8-52", "ovs_interfaceid": "76671be8-5210-460f-bbee-4b6e5f9d6221", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 623.336130] env[60024]: DEBUG oslo_concurrency.lockutils [req-c89c1339-df8b-48b1-90b1-fb33d73e97af req-f5a2fe44-19c8-4eef-9111-b6132c01ae9a service nova] Releasing lock "refresh_cache-036d6de2-f69b-4714-b89e-9c4307253675" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 635.845358] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 635.873912] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 635.873912] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Starting heal instance info cache {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 635.873912] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Rebuilding the list of instances to heal {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 635.904681] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 635.904821] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Skipping network cache update for instance because it is Building. 
{{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 635.904915] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 635.905053] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 635.905190] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 635.905409] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 635.905464] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 635.905595] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 635.905640] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 635.905755] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 635.905873] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Didn't find any instances for network info cache update. 
{{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 635.907019] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 635.907019] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 636.341272] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 636.341525] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 636.344446] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 636.344446] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 636.344446] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 636.344446] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60024) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 636.344446] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 636.356387] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.356583] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.356750] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 636.356910] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60024) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 636.357997] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2252b790-996e-4c35-9bdf-6194d759b144 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.368366] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7257710a-54da-4524-9fd7-66d76adbfbdc {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.385848] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af7c78e8-2d8c-42e9-9826-72c32cd67771 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.393886] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b32ee11-01db-4102-b773-dad1d0180051 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.432449] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180653MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60024) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 636.432681] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.432956] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.514733] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 8a0d9829-6759-4593-9230-459a546a5908 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 636.514944] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance c4400e80-4457-4a8a-8588-f594e5993cde actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 636.515092] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 68c87b51-b90a-47cc-bec1-05f7c389fc14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 636.515224] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 7a4778b7-5ffc-4641-b968-d0304fd67ee0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 636.515348] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 37916d26-1b5e-4991-83a2-ca5a5b00c2ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 636.515550] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 363f5261-d589-4f99-b7dd-ab8f16cefee3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 636.515611] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance ce222a29-3611-45b3-9664-87ae2fb1b1b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 636.515722] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 9214a18f-c22d-4e24-980e-7241a2b993bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 636.516086] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 3ab1b905-cd6f-4d2b-a244-f85e56f796d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 636.516086] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 036d6de2-f69b-4714-b89e-9c4307253675 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 636.543173] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 636.543406] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 636.543554] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=100GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 636.724611] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37545a17-9136-4aaf-9fa5-8b2e12c275a2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.732637] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45e29259-0b83-49b6-ad83-34c2b734efaf {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.766875] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bfb9da4-aeba-4df2-8a30-d33781153adc {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.775598] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13a275f0-66d0-400f-a00d-04265c6b30ef {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 636.789766] env[60024]: DEBUG nova.compute.provider_tree [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 636.803376] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] 
Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 636.819621] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60024) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 636.819821] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.387s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 651.491958] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Acquiring lock "08e2d758-9005-4822-b157-84710b9c5ed4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 651.492367] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Lock "08e2d758-9005-4822-b157-84710b9c5ed4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 653.012711] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquiring lock "54919bf0-b9f3-4bfc-ba1a-c6a52013e351" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 653.013332] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Lock "54919bf0-b9f3-4bfc-ba1a-c6a52013e351" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 653.043153] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquiring lock "a925d5fc-6437-40bb-adf1-ea10c32dde2a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 653.043456] env[60024]: DEBUG oslo_concurrency.lockutils 
[None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Lock "a925d5fc-6437-40bb-adf1-ea10c32dde2a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 653.999558] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Acquiring lock "076c3dd5-9043-456d-af24-0d2273321085" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 653.999804] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Lock "076c3dd5-9043-456d-af24-0d2273321085" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.329605] env[60024]: WARNING oslo_vmware.rw_handles [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 655.329605] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 655.329605] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 655.329605] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 655.329605] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 655.329605] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 655.329605] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 655.329605] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 655.329605] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 655.329605] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 655.329605] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 655.329605] env[60024]: ERROR oslo_vmware.rw_handles [ 655.330324] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to vmware_temp/3612249c-337b-4cdd-a5fb-e91277991d67/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 655.331747] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 
tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 655.331850] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Copying Virtual Disk [datastore2] vmware_temp/3612249c-337b-4cdd-a5fb-e91277991d67/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/3612249c-337b-4cdd-a5fb-e91277991d67/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 655.334802] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d01cafa1-5fcf-4e0d-9866-cbf620f01434 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.345768] env[60024]: DEBUG oslo_vmware.api [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Waiting for the task: (returnval){ [ 655.345768] env[60024]: value = "task-4576234" [ 655.345768] env[60024]: _type = "Task" [ 655.345768] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 655.355270] env[60024]: DEBUG oslo_vmware.api [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Task: {'id': task-4576234, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 655.861499] env[60024]: DEBUG oslo_vmware.exceptions [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Fault InvalidArgument not matched. 
{{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 655.862044] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 655.862772] env[60024]: ERROR nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 655.862772] env[60024]: Faults: ['InvalidArgument'] [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] Traceback (most recent call last): [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] yield resources [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] self.driver.spawn(context, instance, image_meta, [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] self._vmops.spawn(context, instance, image_meta, injected_files, [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] self._fetch_image_if_missing(context, vi) [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] image_cache(vi, tmp_image_ds_loc) [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] vm_util.copy_virtual_disk( [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] session._wait_for_task(vmdk_copy_task) [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 655.862772] env[60024]: ERROR 
nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] return self.wait_for_task(task_ref) [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] return evt.wait() [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] result = hub.switch() [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] return self.greenlet.switch() [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] self.f(*self.args, **self.kw) [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] raise exceptions.translate_fault(task_info.error) [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] Faults: ['InvalidArgument'] [ 655.862772] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] [ 655.865837] env[60024]: INFO nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Terminating instance [ 655.868053] env[60024]: DEBUG nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 655.868053] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 655.868053] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 655.868053] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 655.868926] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88ca2b6c-18f5-4749-8de7-d993c4db1fc5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.871897] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e0eff614-e204-4ea2-bdb4-66a307e22c8e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.882024] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 655.882024] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-362da5f5-72ec-4d54-93b6-7662c27b79ce {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.885643] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 655.885643] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 655.885643] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8c83f90f-70d7-4ac5-ab94-c3c40768c448 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.893990] env[60024]: DEBUG oslo_vmware.api [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Waiting for the task: (returnval){ [ 655.893990] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]522315d5-07b2-f391-d3e3-032182a8f1cb" [ 655.893990] env[60024]: _type = "Task" [ 655.893990] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 655.908581] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 655.911726] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Creating directory with path [datastore2] vmware_temp/6f9857b6-6e00-43d6-b6f7-9b53ba446a1e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 655.911726] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-17590c1d-f635-4237-99da-940a0bbde29f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.933032] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Created directory with path [datastore2] vmware_temp/6f9857b6-6e00-43d6-b6f7-9b53ba446a1e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 655.933032] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Fetch image to [datastore2] vmware_temp/6f9857b6-6e00-43d6-b6f7-9b53ba446a1e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 655.933032] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/6f9857b6-6e00-43d6-b6f7-9b53ba446a1e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 655.933032] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc3c0ccd-2f0f-4b53-b410-e4a6d345b0a2 {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.942113] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11f85a25-61ed-4ec5-8d5d-567ad799e22f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.954505] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5dad9dc7-d84a-4ea5-95ec-096a193a55b3 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.996302] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba3a5950-25f6-46e0-9d89-4c1be52cead2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.999488] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 655.999833] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 656.000142] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Deleting the datastore file [datastore2] 8a0d9829-6759-4593-9230-459a546a5908 {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 656.000482] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b17d2d3a-c337-4e5a-8b6d-d19505baff53 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.008822] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a034b4f8-84f6-421b-9b67-2c631256283b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.011746] env[60024]: DEBUG oslo_vmware.api [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Waiting for the task: (returnval){ [ 656.011746] env[60024]: value = "task-4576236" [ 656.011746] env[60024]: _type = "Task" [ 656.011746] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 656.022794] env[60024]: DEBUG oslo_vmware.api [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Task: {'id': task-4576236, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 656.107200] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 656.165872] env[60024]: DEBUG oslo_vmware.rw_handles [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6f9857b6-6e00-43d6-b6f7-9b53ba446a1e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 656.225808] env[60024]: DEBUG oslo_vmware.rw_handles [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Completed reading data from the image iterator. {{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 656.226027] env[60024]: DEBUG oslo_vmware.rw_handles [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6f9857b6-6e00-43d6-b6f7-9b53ba446a1e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 656.527970] env[60024]: DEBUG oslo_vmware.api [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Task: {'id': task-4576236, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072557} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 656.528288] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 656.528288] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 656.528406] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 656.528733] env[60024]: INFO nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Took 0.66 seconds to destroy the instance on the hypervisor. [ 656.531031] env[60024]: DEBUG nova.compute.claims [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 656.531120] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 656.531341] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 656.878595] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a289e84e-0026-4a99-8b59-4fd8333f7a81 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.891126] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40c4d027-e7be-4666-9cde-3cf6f6d93cf1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.929854] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afe7e46a-979e-48ce-9ab1-270a8311d40d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.938634] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d56d44dd-d55d-4df7-9fd3-3c8ed2e23df3 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
656.958150] env[60024]: DEBUG nova.compute.provider_tree [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 656.967865] env[60024]: DEBUG nova.scheduler.client.report [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 656.989379] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.458s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 656.989913] env[60024]: ERROR nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 656.989913] env[60024]: Faults: ['InvalidArgument'] [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] Traceback (most recent call last): [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] self.driver.spawn(context, instance, image_meta, [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] self._vmops.spawn(context, instance, image_meta, injected_files, [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] self._fetch_image_if_missing(context, vi) [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] image_cache(vi, tmp_image_ds_loc) [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in 
_cache_sparse_image [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] vm_util.copy_virtual_disk( [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] session._wait_for_task(vmdk_copy_task) [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] return self.wait_for_task(task_ref) [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] return evt.wait() [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] result = hub.switch() [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] return self.greenlet.switch() [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] self.f(*self.args, **self.kw) [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] raise exceptions.translate_fault(task_info.error) [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] Faults: ['InvalidArgument'] [ 656.989913] env[60024]: ERROR nova.compute.manager [instance: 8a0d9829-6759-4593-9230-459a546a5908] [ 656.990910] env[60024]: DEBUG nova.compute.utils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] VimFaultException {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 656.993525] env[60024]: DEBUG nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Build of instance 8a0d9829-6759-4593-9230-459a546a5908 was re-scheduled: A specified parameter was not correct: fileType [ 
656.993525] env[60024]: Faults: ['InvalidArgument'] {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 656.995931] env[60024]: DEBUG nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 656.995931] env[60024]: DEBUG nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 656.995931] env[60024]: DEBUG nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 656.995931] env[60024]: DEBUG nova.network.neutron [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 658.314219] env[60024]: DEBUG nova.network.neutron [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 658.333926] env[60024]: INFO nova.compute.manager [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: 8a0d9829-6759-4593-9230-459a546a5908] Took 1.34 seconds to deallocate network for instance. [ 658.458038] env[60024]: INFO nova.scheduler.client.report [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Deleted allocations for instance 8a0d9829-6759-4593-9230-459a546a5908 [ 658.477846] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b44b385c-ccd6-4d8e-a3a7-e113eb34c2f3 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Lock "8a0d9829-6759-4593-9230-459a546a5908" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 97.231s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 658.502305] env[60024]: DEBUG nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Starting instance... 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 658.566860] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 658.567141] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 658.568886] env[60024]: INFO nova.compute.claims [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 658.815771] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59244be5-a7b8-4025-b91d-106ff3a7f024 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.829640] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28448669-49bf-4c17-95ed-087045bbb882 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.864674] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dda31cc4-b391-4aaf-bedd-3f1da818a480 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.875142] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a3eef7d-69e7-4fdb-a7ed-b92639c7705b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.892072] env[60024]: DEBUG nova.compute.provider_tree [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 658.904514] env[60024]: DEBUG nova.scheduler.client.report [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 658.918715] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 
tempest-TenantUsagesTestJSON-787131523-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.351s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 658.919235] env[60024]: DEBUG nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 658.978245] env[60024]: DEBUG nova.compute.utils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 658.978245] env[60024]: DEBUG nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 658.978454] env[60024]: DEBUG nova.network.neutron [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 658.992741] env[60024]: DEBUG nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 659.079413] env[60024]: DEBUG nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 659.103535] env[60024]: DEBUG nova.virt.hardware [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 659.103535] env[60024]: DEBUG nova.virt.hardware [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 659.103678] env[60024]: DEBUG nova.virt.hardware [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 659.103920] env[60024]: DEBUG nova.virt.hardware [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 659.104148] env[60024]: DEBUG nova.virt.hardware [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 659.104357] env[60024]: DEBUG nova.virt.hardware [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 659.105446] env[60024]: DEBUG nova.virt.hardware [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 659.105446] env[60024]: DEBUG nova.virt.hardware [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 659.105446] env[60024]: DEBUG nova.virt.hardware [None req-68696909-aa75-4998-bb13-0c63cf192b9e 
tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 659.105446] env[60024]: DEBUG nova.virt.hardware [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 659.105643] env[60024]: DEBUG nova.virt.hardware [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 659.106778] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-111b44d9-2b12-4eb6-b82e-e9201713eb83 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 659.122704] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e9773d0-4eab-4980-b0e3-e1a662d293a6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 659.239039] env[60024]: DEBUG nova.policy [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f9501d1d780a405a8c0ba4b0e90a054c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '215efdf54af34adc8db180a90116d8bd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 660.613521] env[60024]: DEBUG nova.network.neutron [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Successfully created port: fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 661.540624] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5269542d-1bd0-4a60-91e5-e702adcd1908 tempest-AttachInterfacesTestJSON-62544092 tempest-AttachInterfacesTestJSON-62544092-project-member] Acquiring lock "1eaf8e02-bfb0-4928-9687-cc781a84d16d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 661.542643] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5269542d-1bd0-4a60-91e5-e702adcd1908 tempest-AttachInterfacesTestJSON-62544092 tempest-AttachInterfacesTestJSON-62544092-project-member] Lock "1eaf8e02-bfb0-4928-9687-cc781a84d16d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 661.849615] env[60024]: DEBUG nova.network.neutron [None 
req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Successfully updated port: fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 661.860068] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Acquiring lock "refresh_cache-5888cc9f-7341-4f9c-a93c-dd5ec95f7369" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 661.860068] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Acquired lock "refresh_cache-5888cc9f-7341-4f9c-a93c-dd5ec95f7369" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 661.860068] env[60024]: DEBUG nova.network.neutron [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 661.901808] env[60024]: DEBUG nova.network.neutron [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 662.245248] env[60024]: DEBUG nova.network.neutron [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Updating instance_info_cache with network_info: [{"id": "fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d", "address": "fa:16:3e:9f:53:02", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.166", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfe1f6aa9-5a", "ovs_interfaceid": "fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 662.264396] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Releasing lock 
"refresh_cache-5888cc9f-7341-4f9c-a93c-dd5ec95f7369" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 662.264714] env[60024]: DEBUG nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Instance network_info: |[{"id": "fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d", "address": "fa:16:3e:9f:53:02", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.166", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfe1f6aa9-5a", "ovs_interfaceid": "fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 662.265628] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9f:53:02', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3d31a554-a94c-4471-892f-f65aa87b8279', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 662.274472] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Creating folder: Project (215efdf54af34adc8db180a90116d8bd). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 662.275103] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-478fa255-351a-4bcc-890d-64356e562035 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 662.288502] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Created folder: Project (215efdf54af34adc8db180a90116d8bd) in parent group-v894073. [ 662.288781] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Creating folder: Instances. Parent ref: group-v894105. 
{{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 662.289569] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f619b187-182c-4258-a359-141ff8b46891 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 662.302189] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Created folder: Instances in parent group-v894105. [ 662.302536] env[60024]: DEBUG oslo.service.loopingcall [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 662.302696] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 662.302897] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3b98eef8-0211-40df-a98a-cd205b03f326 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 662.326344] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 662.326344] env[60024]: value = "task-4576239" [ 662.326344] env[60024]: _type = "Task" [ 662.326344] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 662.340212] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576239, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 662.375059] env[60024]: DEBUG nova.compute.manager [req-d443584e-772a-4014-a842-5a8bb0871258 req-d44aba9b-077d-4cb5-a76d-37a1538275e9 service nova] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Received event network-vif-plugged-fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 662.375059] env[60024]: DEBUG oslo_concurrency.lockutils [req-d443584e-772a-4014-a842-5a8bb0871258 req-d44aba9b-077d-4cb5-a76d-37a1538275e9 service nova] Acquiring lock "5888cc9f-7341-4f9c-a93c-dd5ec95f7369-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 662.375059] env[60024]: DEBUG oslo_concurrency.lockutils [req-d443584e-772a-4014-a842-5a8bb0871258 req-d44aba9b-077d-4cb5-a76d-37a1538275e9 service nova] Lock "5888cc9f-7341-4f9c-a93c-dd5ec95f7369-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 662.375059] env[60024]: DEBUG oslo_concurrency.lockutils [req-d443584e-772a-4014-a842-5a8bb0871258 req-d44aba9b-077d-4cb5-a76d-37a1538275e9 service nova] Lock "5888cc9f-7341-4f9c-a93c-dd5ec95f7369-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 662.375241] env[60024]: DEBUG nova.compute.manager [req-d443584e-772a-4014-a842-5a8bb0871258 req-d44aba9b-077d-4cb5-a76d-37a1538275e9 service nova] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] No waiting events found dispatching network-vif-plugged-fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 662.375241] env[60024]: WARNING nova.compute.manager [req-d443584e-772a-4014-a842-5a8bb0871258 req-d44aba9b-077d-4cb5-a76d-37a1538275e9 service nova] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Received unexpected event network-vif-plugged-fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d for instance with vm_state building and task_state spawning. [ 662.837514] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576239, 'name': CreateVM_Task, 'duration_secs': 0.340631} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 662.837831] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 662.838892] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 662.839592] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 662.840170] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 662.840497] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dc34b7a2-a95a-42c7-a759-86af346aacce {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 662.848025] env[60024]: DEBUG oslo_vmware.api [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Waiting for the task: (returnval){ [ 662.848025] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52b990ce-4f65-5d40-a924-1ae9fae45aaa" [ 662.848025] env[60024]: _type = "Task" [ 662.848025] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 662.864337] env[60024]: DEBUG oslo_vmware.api [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52b990ce-4f65-5d40-a924-1ae9fae45aaa, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 663.362922] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 663.363257] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 663.363479] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 663.907247] env[60024]: DEBUG oslo_concurrency.lockutils [None req-14455799-885d-47a6-b31a-d48adadc4279 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Acquiring lock "c841298b-f103-4dc7-8884-efdf2ebc20a6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 663.907510] env[60024]: DEBUG oslo_concurrency.lockutils [None req-14455799-885d-47a6-b31a-d48adadc4279 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Lock "c841298b-f103-4dc7-8884-efdf2ebc20a6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.076330] env[60024]: DEBUG nova.compute.manager [req-d9a5b21a-c134-4d63-b232-412ac1dd98fd req-9ec21a99-a641-4a33-afdd-16690c13ba22 service nova] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Received event network-changed-fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 665.076577] env[60024]: DEBUG nova.compute.manager [req-d9a5b21a-c134-4d63-b232-412ac1dd98fd req-9ec21a99-a641-4a33-afdd-16690c13ba22 service nova] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Refreshing instance network info cache due to event network-changed-fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d. 
{{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 665.076768] env[60024]: DEBUG oslo_concurrency.lockutils [req-d9a5b21a-c134-4d63-b232-412ac1dd98fd req-9ec21a99-a641-4a33-afdd-16690c13ba22 service nova] Acquiring lock "refresh_cache-5888cc9f-7341-4f9c-a93c-dd5ec95f7369" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 665.076871] env[60024]: DEBUG oslo_concurrency.lockutils [req-d9a5b21a-c134-4d63-b232-412ac1dd98fd req-9ec21a99-a641-4a33-afdd-16690c13ba22 service nova] Acquired lock "refresh_cache-5888cc9f-7341-4f9c-a93c-dd5ec95f7369" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 665.077032] env[60024]: DEBUG nova.network.neutron [req-d9a5b21a-c134-4d63-b232-412ac1dd98fd req-9ec21a99-a641-4a33-afdd-16690c13ba22 service nova] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Refreshing network info cache for port fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 665.787017] env[60024]: DEBUG nova.network.neutron [req-d9a5b21a-c134-4d63-b232-412ac1dd98fd req-9ec21a99-a641-4a33-afdd-16690c13ba22 service nova] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Updated VIF entry in instance network info cache for port fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d. {{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 665.787017] env[60024]: DEBUG nova.network.neutron [req-d9a5b21a-c134-4d63-b232-412ac1dd98fd req-9ec21a99-a641-4a33-afdd-16690c13ba22 service nova] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Updating instance_info_cache with network_info: [{"id": "fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d", "address": "fa:16:3e:9f:53:02", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.166", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfe1f6aa9-5a", "ovs_interfaceid": "fe1f6aa9-5a13-43be-9a0f-bbca264a9e3d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 665.798103] env[60024]: DEBUG oslo_concurrency.lockutils [req-d9a5b21a-c134-4d63-b232-412ac1dd98fd req-9ec21a99-a641-4a33-afdd-16690c13ba22 service nova] Releasing lock "refresh_cache-5888cc9f-7341-4f9c-a93c-dd5ec95f7369" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 666.510776] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2a575c2b-5102-4501-a724-cecb4fb9a882 tempest-ServersTestJSON-1695110591 tempest-ServersTestJSON-1695110591-project-member] Acquiring lock "bf70d23b-4ab5-476e-814c-264b6a9f2455" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 666.511115] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2a575c2b-5102-4501-a724-cecb4fb9a882 tempest-ServersTestJSON-1695110591 tempest-ServersTestJSON-1695110591-project-member] Lock "bf70d23b-4ab5-476e-814c-264b6a9f2455" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 669.415741] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b87ef477-9e92-4961-845a-6acd6bee3a06 tempest-ServersNegativeTestJSON-1972359588 tempest-ServersNegativeTestJSON-1972359588-project-member] Acquiring lock "67ff7d52-6e30-4730-9b5a-9ae32f68b953" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 669.416013] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b87ef477-9e92-4961-845a-6acd6bee3a06 tempest-ServersNegativeTestJSON-1972359588 tempest-ServersNegativeTestJSON-1972359588-project-member] Lock "67ff7d52-6e30-4730-9b5a-9ae32f68b953" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 669.631657] env[60024]: DEBUG oslo_concurrency.lockutils [None req-ba27ce61-be1b-476b-9184-384249a73295 tempest-SecurityGroupsTestJSON-2088684547 tempest-SecurityGroupsTestJSON-2088684547-project-member] Acquiring lock "bd91e947-acae-4dbd-b48b-5a6727eb4cbb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 669.631893] env[60024]: DEBUG oslo_concurrency.lockutils [None req-ba27ce61-be1b-476b-9184-384249a73295 tempest-SecurityGroupsTestJSON-2088684547 tempest-SecurityGroupsTestJSON-2088684547-project-member] Lock "bd91e947-acae-4dbd-b48b-5a6727eb4cbb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 677.441077] env[60024]: DEBUG oslo_concurrency.lockutils [None req-798944be-b318-4a6d-a582-3984731e10fc tempest-ServerActionsTestOtherA-202760684 tempest-ServerActionsTestOtherA-202760684-project-member] Acquiring lock "95358801-c9d8-4582-a712-36a8bf586456" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 677.441420] env[60024]: DEBUG oslo_concurrency.lockutils [None req-798944be-b318-4a6d-a582-3984731e10fc tempest-ServerActionsTestOtherA-202760684 tempest-ServerActionsTestOtherA-202760684-project-member] Lock "95358801-c9d8-4582-a712-36a8bf586456" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 694.486803] env[60024]: DEBUG oslo_concurrency.lockutils [None req-814fad23-a147-41ec-ba07-f0ec4ac3c42f 
tempest-ServerActionsV293TestJSON-332701173 tempest-ServerActionsV293TestJSON-332701173-project-member] Acquiring lock "be6a4290-dbb3-4e1f-bdd4-0dc106db9435" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.487134] env[60024]: DEBUG oslo_concurrency.lockutils [None req-814fad23-a147-41ec-ba07-f0ec4ac3c42f tempest-ServerActionsV293TestJSON-332701173 tempest-ServerActionsV293TestJSON-332701173-project-member] Lock "be6a4290-dbb3-4e1f-bdd4-0dc106db9435" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.820095] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 696.820440] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 696.820483] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 697.337596] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 697.341305] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 697.341489] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Starting heal instance info cache {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 697.341680] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Rebuilding the list of instances to heal {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 697.362614] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 697.362771] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Skipping network cache update for instance because it is Building. 
{{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 697.362900] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 697.363042] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 697.363174] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 697.363353] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 697.363488] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 697.363610] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 697.363886] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 697.364056] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 697.364183] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Didn't find any instances for network info cache update. 
{{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 697.364649] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 697.364824] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 698.341681] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 698.341999] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60024) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 698.341999] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 698.354955] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.355203] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.355370] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 698.355525] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60024) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 698.356637] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f94b585c-fcfb-434e-8f41-63d17ca66bf9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.366833] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00ab3f5b-85e2-4f3d-a41d-11ab5997dfdd {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.381049] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e92b4826-bc1a-4342-a2ad-df65d8c33422 {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.387731] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89d32f7e-aa27-4faf-86bd-2173d6cbd606 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.417403] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180692MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60024) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 698.417581] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.417746] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.481925] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance c4400e80-4457-4a8a-8588-f594e5993cde actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 698.482102] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 68c87b51-b90a-47cc-bec1-05f7c389fc14 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 698.482236] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 7a4778b7-5ffc-4641-b968-d0304fd67ee0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 698.482360] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 37916d26-1b5e-4991-83a2-ca5a5b00c2ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 698.482482] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 363f5261-d589-4f99-b7dd-ab8f16cefee3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 698.482627] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance ce222a29-3611-45b3-9664-87ae2fb1b1b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 698.482753] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 9214a18f-c22d-4e24-980e-7241a2b993bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 698.482873] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 3ab1b905-cd6f-4d2b-a244-f85e56f796d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 698.482986] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 036d6de2-f69b-4714-b89e-9c4307253675 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 698.483112] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 698.506418] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 08e2d758-9005-4822-b157-84710b9c5ed4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 698.517538] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 54919bf0-b9f3-4bfc-ba1a-c6a52013e351 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 698.531949] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance a925d5fc-6437-40bb-adf1-ea10c32dde2a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 698.542595] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 076c3dd5-9043-456d-af24-0d2273321085 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 698.552975] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 1eaf8e02-bfb0-4928-9687-cc781a84d16d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 698.562788] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance c841298b-f103-4dc7-8884-efdf2ebc20a6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 698.572274] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance bf70d23b-4ab5-476e-814c-264b6a9f2455 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 698.582032] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 67ff7d52-6e30-4730-9b5a-9ae32f68b953 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 698.591672] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance bd91e947-acae-4dbd-b48b-5a6727eb4cbb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 698.602582] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 95358801-c9d8-4582-a712-36a8bf586456 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 698.612054] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance be6a4290-dbb3-4e1f-bdd4-0dc106db9435 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 698.612307] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 698.612459] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=100GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 698.872987] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27b298fe-db67-4a75-b474-20701ea9013d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.881721] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eb98e0e-3693-4c67-b558-f04d28177baa {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.911218] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6012f1ee-2695-4848-8396-334b0147a8cc {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.919132] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79af3654-b6cb-41ae-a8f3-69db1c69dd27 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.934172] env[60024]: DEBUG nova.compute.provider_tree [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 698.943314] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 698.956729] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60024) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 698.956988] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.539s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 703.412284] env[60024]: WARNING oslo_vmware.rw_handles [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 703.412284] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 703.412284] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 703.412284] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 703.412284] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 703.412284] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 703.412284] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 703.412284] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 703.412284] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 703.412284] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 703.412284] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 703.412284] env[60024]: ERROR oslo_vmware.rw_handles [ 703.412811] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to vmware_temp/6f9857b6-6e00-43d6-b6f7-9b53ba446a1e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 703.414464] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 703.414762] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Copying Virtual Disk [datastore2] vmware_temp/6f9857b6-6e00-43d6-b6f7-9b53ba446a1e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/6f9857b6-6e00-43d6-b6f7-9b53ba446a1e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 703.415092] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-52ca320b-9c92-4b34-ae55-fdca50f90e0a {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.423700] env[60024]: DEBUG oslo_vmware.api [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Waiting for the task: (returnval){ [ 703.423700] env[60024]: value = "task-4576251" [ 703.423700] env[60024]: _type = "Task" [ 703.423700] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 703.433319] env[60024]: DEBUG oslo_vmware.api [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Task: {'id': task-4576251, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 703.935304] env[60024]: DEBUG oslo_vmware.exceptions [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Fault InvalidArgument not matched. {{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 703.935551] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 703.936094] env[60024]: ERROR nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 703.936094] env[60024]: Faults: ['InvalidArgument'] [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Traceback (most recent call last): [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] yield resources [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] self.driver.spawn(context, instance, image_meta, [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] self._vmops.spawn(context, instance, image_meta, injected_files, [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] self._fetch_image_if_missing(context, vi) [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 
68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] image_cache(vi, tmp_image_ds_loc) [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] vm_util.copy_virtual_disk( [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] session._wait_for_task(vmdk_copy_task) [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] return self.wait_for_task(task_ref) [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] return evt.wait() [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] result = hub.switch() [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] return self.greenlet.switch() [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] self.f(*self.args, **self.kw) [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] raise exceptions.translate_fault(task_info.error) [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Faults: ['InvalidArgument'] [ 703.936094] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] [ 703.937157] env[60024]: INFO nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Terminating instance [ 
703.937969] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 703.938197] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 703.939032] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a048d041-b4ab-4410-8d23-c83cfe4de90d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.940523] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Acquiring lock "refresh_cache-68c87b51-b90a-47cc-bec1-05f7c389fc14" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 703.940686] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Acquired lock "refresh_cache-68c87b51-b90a-47cc-bec1-05f7c389fc14" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 703.940852] env[60024]: DEBUG nova.network.neutron [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 703.948381] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 703.948564] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 703.949794] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-114cb0fb-1426-4739-9236-63d7b948ddfa {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 703.958583] env[60024]: DEBUG oslo_vmware.api [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Waiting for the task: (returnval){ [ 703.958583] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52a22ae0-f91d-4c44-bfa3-1cfabd7699fa" [ 703.958583] env[60024]: _type = "Task" [ 703.958583] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 703.971316] env[60024]: DEBUG oslo_vmware.api [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52a22ae0-f91d-4c44-bfa3-1cfabd7699fa, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 703.975427] env[60024]: DEBUG nova.network.neutron [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 704.276652] env[60024]: DEBUG nova.network.neutron [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 704.285947] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Releasing lock "refresh_cache-68c87b51-b90a-47cc-bec1-05f7c389fc14" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 704.286390] env[60024]: DEBUG nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 704.286641] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 704.287728] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d960ac02-7ffd-4141-9e0e-a4c27b08352a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.296534] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 704.296765] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-816a5bd1-03b8-494c-b864-6c66f79f8237 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.333594] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 704.333874] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 704.334160] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Deleting the datastore file [datastore2] 68c87b51-b90a-47cc-bec1-05f7c389fc14 {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 704.334458] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4b6174d5-8a27-46a7-bb69-c4034dc183f2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.341823] env[60024]: DEBUG oslo_vmware.api [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Waiting for the task: (returnval){ [ 704.341823] env[60024]: value = "task-4576253" [ 704.341823] env[60024]: _type = "Task" [ 704.341823] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 704.351046] env[60024]: DEBUG oslo_vmware.api [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Task: {'id': task-4576253, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 704.470541] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 704.470541] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Creating directory with path [datastore2] vmware_temp/d095bb69-7f08-4487-979e-049e54308123/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 704.470541] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-99674dd0-3a90-4a45-9cbe-07edb7bcb20a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.483545] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Created directory with path [datastore2] vmware_temp/d095bb69-7f08-4487-979e-049e54308123/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 704.483756] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Fetch image to [datastore2] vmware_temp/d095bb69-7f08-4487-979e-049e54308123/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 704.484061] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/d095bb69-7f08-4487-979e-049e54308123/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 704.484780] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-105d7a48-8a55-4147-bb04-3e856f4a973f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.492514] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef4cd0eb-60e4-40ba-86a7-4ca368370a44 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.502546] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0feed8ed-676f-42c5-9ed7-eb11eaa5826d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.535031] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-19ce9818-6617-4384-aa2f-282f8abd6448 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.542210] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-46342208-30d0-4093-90ff-52a9b564bda6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 704.573536] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 704.622254] env[60024]: DEBUG oslo_vmware.rw_handles [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d095bb69-7f08-4487-979e-049e54308123/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 704.680533] env[60024]: DEBUG oslo_vmware.rw_handles [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Completed reading data from the image iterator. {{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 704.680533] env[60024]: DEBUG oslo_vmware.rw_handles [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d095bb69-7f08-4487-979e-049e54308123/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 704.852184] env[60024]: DEBUG oslo_vmware.api [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Task: {'id': task-4576253, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.035681} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 704.852437] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 704.852640] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 704.852834] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 704.852995] env[60024]: INFO nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Took 0.57 seconds to destroy the instance on the hypervisor. [ 704.853243] env[60024]: DEBUG oslo.service.loopingcall [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 704.853439] env[60024]: DEBUG nova.compute.manager [-] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Skipping network deallocation for instance since networking was not requested. 
{{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 704.855537] env[60024]: DEBUG nova.compute.claims [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 704.855705] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 704.855916] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.134577] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dac1cb9f-0677-40f1-9b14-31ae79a855a0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.142948] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af74748d-c47d-4503-a2e4-5332de2ffbfc {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.174320] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e765be3-be8b-48e8-a57a-2d09a53bd501 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.182104] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab729336-1df8-4fac-8180-1ca0d75f89fa {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.196076] env[60024]: DEBUG nova.compute.provider_tree [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 705.204385] env[60024]: DEBUG nova.scheduler.client.report [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 705.218156] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 
tempest-ServersAdmin275Test-1140959763-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.362s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.218681] env[60024]: ERROR nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 705.218681] env[60024]: Faults: ['InvalidArgument'] [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Traceback (most recent call last): [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] self.driver.spawn(context, instance, image_meta, [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] self._vmops.spawn(context, instance, image_meta, injected_files, [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] self._fetch_image_if_missing(context, vi) [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] image_cache(vi, tmp_image_ds_loc) [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] vm_util.copy_virtual_disk( [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] session._wait_for_task(vmdk_copy_task) [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] return self.wait_for_task(task_ref) [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] return evt.wait() [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] result = hub.switch() [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] return self.greenlet.switch() [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] self.f(*self.args, **self.kw) [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] raise exceptions.translate_fault(task_info.error) [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Faults: ['InvalidArgument'] [ 705.218681] env[60024]: ERROR nova.compute.manager [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] [ 705.220028] env[60024]: DEBUG nova.compute.utils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] VimFaultException {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 705.220881] env[60024]: DEBUG nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Build of instance 68c87b51-b90a-47cc-bec1-05f7c389fc14 was re-scheduled: A specified parameter was not correct: fileType [ 705.220881] env[60024]: Faults: ['InvalidArgument'] {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 705.221280] env[60024]: DEBUG nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 705.221502] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Acquiring lock "refresh_cache-68c87b51-b90a-47cc-bec1-05f7c389fc14" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 705.221697] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Acquired lock "refresh_cache-68c87b51-b90a-47cc-bec1-05f7c389fc14" {{(pid=60024) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 705.221859] env[60024]: DEBUG nova.network.neutron [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 705.248760] env[60024]: DEBUG nova.network.neutron [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 705.350297] env[60024]: DEBUG nova.network.neutron [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 705.361038] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Releasing lock "refresh_cache-68c87b51-b90a-47cc-bec1-05f7c389fc14" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 705.361319] env[60024]: DEBUG nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 705.361539] env[60024]: DEBUG nova.compute.manager [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] [instance: 68c87b51-b90a-47cc-bec1-05f7c389fc14] Skipping network deallocation for instance since networking was not requested. {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 705.464295] env[60024]: INFO nova.scheduler.client.report [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Deleted allocations for instance 68c87b51-b90a-47cc-bec1-05f7c389fc14 [ 705.484960] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b2f2d782-6e13-439a-b7a6-82d4614f8bd4 tempest-ServersAdmin275Test-1140959763 tempest-ServersAdmin275Test-1140959763-project-member] Lock "68c87b51-b90a-47cc-bec1-05f7c389fc14" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 136.584s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.498019] env[60024]: DEBUG nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Starting instance... 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 705.547066] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 705.547460] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.548957] env[60024]: INFO nova.compute.claims [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 705.879923] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e58c02a2-1db1-45a8-8c7e-110478271357 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.888939] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62674531-1af4-4a87-92af-be184d494b02 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.920991] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e059d513-4ca6-4db0-9ec1-a0b324fc90c2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.929258] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52c341b5-e56b-48dd-a097-ebb27e877741 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.943487] env[60024]: DEBUG nova.compute.provider_tree [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 705.953310] env[60024]: DEBUG nova.scheduler.client.report [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 705.966920] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 
tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.419s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 705.967439] env[60024]: DEBUG nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 706.003829] env[60024]: DEBUG nova.compute.utils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 706.005263] env[60024]: DEBUG nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 706.005484] env[60024]: DEBUG nova.network.neutron [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 706.015163] env[60024]: DEBUG nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 706.063867] env[60024]: DEBUG nova.policy [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '41a535ff3d8148c29a9bf62afbeff688', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f42f07b8ce5040279223112427eb62c9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 706.085667] env[60024]: DEBUG nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 706.108127] env[60024]: DEBUG nova.virt.hardware [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=<?>,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-11-20T09:12:32Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 706.108387] env[60024]: DEBUG nova.virt.hardware [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 706.108544] env[60024]: DEBUG nova.virt.hardware [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 706.108724] env[60024]: DEBUG nova.virt.hardware [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 706.108872] env[60024]: DEBUG nova.virt.hardware [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 706.109044] env[60024]: DEBUG nova.virt.hardware [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 706.109276] env[60024]: DEBUG nova.virt.hardware [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 706.109438] env[60024]: DEBUG nova.virt.hardware [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 706.109602] env[60024]: DEBUG 
nova.virt.hardware [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 706.109785] env[60024]: DEBUG nova.virt.hardware [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 706.109921] env[60024]: DEBUG nova.virt.hardware [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 706.110788] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c85fc317-dfe0-4970-8bbd-ba74f1c31abc {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.119069] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03267f86-8a0f-473e-9a20-50e1cd29098d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.431186] env[60024]: DEBUG nova.network.neutron [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Successfully created port: aa044565-88e5-4dc1-a4ad-2a709936cbba {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 707.538367] env[60024]: DEBUG nova.compute.manager [req-fe17a2a4-17fe-4fd1-84a1-5164ce959a3c req-ee9c82d6-65ac-4313-a352-227faed9c088 service nova] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Received event network-vif-plugged-aa044565-88e5-4dc1-a4ad-2a709936cbba {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 707.538670] env[60024]: DEBUG oslo_concurrency.lockutils [req-fe17a2a4-17fe-4fd1-84a1-5164ce959a3c req-ee9c82d6-65ac-4313-a352-227faed9c088 service nova] Acquiring lock "08e2d758-9005-4822-b157-84710b9c5ed4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 707.538802] env[60024]: DEBUG oslo_concurrency.lockutils [req-fe17a2a4-17fe-4fd1-84a1-5164ce959a3c req-ee9c82d6-65ac-4313-a352-227faed9c088 service nova] Lock "08e2d758-9005-4822-b157-84710b9c5ed4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 707.538970] env[60024]: DEBUG oslo_concurrency.lockutils [req-fe17a2a4-17fe-4fd1-84a1-5164ce959a3c req-ee9c82d6-65ac-4313-a352-227faed9c088 service nova] Lock "08e2d758-9005-4822-b157-84710b9c5ed4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.539149] env[60024]: DEBUG nova.compute.manager 
[req-fe17a2a4-17fe-4fd1-84a1-5164ce959a3c req-ee9c82d6-65ac-4313-a352-227faed9c088 service nova] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] No waiting events found dispatching network-vif-plugged-aa044565-88e5-4dc1-a4ad-2a709936cbba {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 707.539313] env[60024]: WARNING nova.compute.manager [req-fe17a2a4-17fe-4fd1-84a1-5164ce959a3c req-ee9c82d6-65ac-4313-a352-227faed9c088 service nova] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Received unexpected event network-vif-plugged-aa044565-88e5-4dc1-a4ad-2a709936cbba for instance with vm_state building and task_state spawning. [ 707.770746] env[60024]: DEBUG nova.network.neutron [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Successfully updated port: aa044565-88e5-4dc1-a4ad-2a709936cbba {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 707.781682] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Acquiring lock "refresh_cache-08e2d758-9005-4822-b157-84710b9c5ed4" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 707.781825] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Acquired lock "refresh_cache-08e2d758-9005-4822-b157-84710b9c5ed4" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 707.781943] env[60024]: DEBUG nova.network.neutron [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 707.853229] env[60024]: DEBUG nova.network.neutron [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Instance cache missing network info. 
{{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 708.224045] env[60024]: DEBUG nova.network.neutron [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Updating instance_info_cache with network_info: [{"id": "aa044565-88e5-4dc1-a4ad-2a709936cbba", "address": "fa:16:3e:e0:e0:ed", "network": {"id": "f62dc658-4a9c-41e6-bbf2-e3575c478501", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1202270755-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f42f07b8ce5040279223112427eb62c9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "61a172ee-af3f-473e-b12a-3fee5bf39c8d", "external-id": "nsx-vlan-transportzone-997", "segmentation_id": 997, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa044565-88", "ovs_interfaceid": "aa044565-88e5-4dc1-a4ad-2a709936cbba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.238592] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Releasing lock "refresh_cache-08e2d758-9005-4822-b157-84710b9c5ed4" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 708.238795] env[60024]: DEBUG nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Instance network_info: |[{"id": "aa044565-88e5-4dc1-a4ad-2a709936cbba", "address": "fa:16:3e:e0:e0:ed", "network": {"id": "f62dc658-4a9c-41e6-bbf2-e3575c478501", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1202270755-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f42f07b8ce5040279223112427eb62c9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "61a172ee-af3f-473e-b12a-3fee5bf39c8d", "external-id": "nsx-vlan-transportzone-997", "segmentation_id": 997, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa044565-88", "ovs_interfaceid": "aa044565-88e5-4dc1-a4ad-2a709936cbba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 708.239179] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e0:e0:ed', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '61a172ee-af3f-473e-b12a-3fee5bf39c8d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'aa044565-88e5-4dc1-a4ad-2a709936cbba', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 708.248025] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Creating folder: Project (f42f07b8ce5040279223112427eb62c9). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 708.248630] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-79b50c6b-8df2-49ff-9f9d-9a94732429f5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.261604] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Created folder: Project (f42f07b8ce5040279223112427eb62c9) in parent group-v894073. [ 708.262641] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Creating folder: Instances. Parent ref: group-v894112. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 708.262914] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3f5978b2-a05e-4506-a000-281296cc7bb8 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.273548] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Created folder: Instances in parent group-v894112. [ 708.273804] env[60024]: DEBUG oslo.service.loopingcall [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 708.274008] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 708.274249] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b097ae08-5357-4b13-ac6b-a98e33dc139f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.297331] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 708.297331] env[60024]: value = "task-4576256" [ 708.297331] env[60024]: _type = "Task" [ 708.297331] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 708.307618] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576256, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 708.808543] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576256, 'name': CreateVM_Task, 'duration_secs': 0.332319} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 708.808767] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 708.809284] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 708.809446] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 708.809774] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 708.810075] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e08c6364-ec2f-4778-936f-a373810512ad {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.815435] env[60024]: DEBUG oslo_vmware.api [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Waiting for the task: (returnval){ [ 708.815435] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]5281a403-5f9a-6044-bf15-6b157eb03f83" [ 708.815435] env[60024]: _type = "Task" [ 708.815435] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 708.824148] env[60024]: DEBUG oslo_vmware.api [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]5281a403-5f9a-6044-bf15-6b157eb03f83, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 709.330807] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 709.331090] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 709.331426] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 709.983383] env[60024]: DEBUG nova.compute.manager [req-8ec336c2-fdb7-454e-9324-086ffb88aa14 req-a689861f-5857-415b-84cd-5daf5380c6bb service nova] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Received event network-changed-aa044565-88e5-4dc1-a4ad-2a709936cbba {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 709.983722] env[60024]: DEBUG nova.compute.manager [req-8ec336c2-fdb7-454e-9324-086ffb88aa14 req-a689861f-5857-415b-84cd-5daf5380c6bb service nova] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Refreshing instance network info cache due to event network-changed-aa044565-88e5-4dc1-a4ad-2a709936cbba. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 709.983801] env[60024]: DEBUG oslo_concurrency.lockutils [req-8ec336c2-fdb7-454e-9324-086ffb88aa14 req-a689861f-5857-415b-84cd-5daf5380c6bb service nova] Acquiring lock "refresh_cache-08e2d758-9005-4822-b157-84710b9c5ed4" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 709.983939] env[60024]: DEBUG oslo_concurrency.lockutils [req-8ec336c2-fdb7-454e-9324-086ffb88aa14 req-a689861f-5857-415b-84cd-5daf5380c6bb service nova] Acquired lock "refresh_cache-08e2d758-9005-4822-b157-84710b9c5ed4" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 709.984331] env[60024]: DEBUG nova.network.neutron [req-8ec336c2-fdb7-454e-9324-086ffb88aa14 req-a689861f-5857-415b-84cd-5daf5380c6bb service nova] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Refreshing network info cache for port aa044565-88e5-4dc1-a4ad-2a709936cbba {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 710.500794] env[60024]: DEBUG nova.network.neutron [req-8ec336c2-fdb7-454e-9324-086ffb88aa14 req-a689861f-5857-415b-84cd-5daf5380c6bb service nova] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Updated VIF entry in instance network info cache for port aa044565-88e5-4dc1-a4ad-2a709936cbba. 
{{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 710.500794] env[60024]: DEBUG nova.network.neutron [req-8ec336c2-fdb7-454e-9324-086ffb88aa14 req-a689861f-5857-415b-84cd-5daf5380c6bb service nova] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Updating instance_info_cache with network_info: [{"id": "aa044565-88e5-4dc1-a4ad-2a709936cbba", "address": "fa:16:3e:e0:e0:ed", "network": {"id": "f62dc658-4a9c-41e6-bbf2-e3575c478501", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1202270755-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f42f07b8ce5040279223112427eb62c9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "61a172ee-af3f-473e-b12a-3fee5bf39c8d", "external-id": "nsx-vlan-transportzone-997", "segmentation_id": 997, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa044565-88", "ovs_interfaceid": "aa044565-88e5-4dc1-a4ad-2a709936cbba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 710.516448] env[60024]: DEBUG oslo_concurrency.lockutils [req-8ec336c2-fdb7-454e-9324-086ffb88aa14 req-a689861f-5857-415b-84cd-5daf5380c6bb service nova] Releasing lock "refresh_cache-08e2d758-9005-4822-b157-84710b9c5ed4" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 751.436245] env[60024]: WARNING oslo_vmware.rw_handles [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 751.436245] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 751.436245] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 751.436245] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 751.436245] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 751.436245] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 751.436245] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 751.436245] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 751.436245] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 751.436245] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 751.436245] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 751.436245] env[60024]: ERROR oslo_vmware.rw_handles [ 751.436867] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 
tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to vmware_temp/d095bb69-7f08-4487-979e-049e54308123/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 751.438630] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 751.438891] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Copying Virtual Disk [datastore2] vmware_temp/d095bb69-7f08-4487-979e-049e54308123/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/d095bb69-7f08-4487-979e-049e54308123/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 751.439402] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-14e2caa6-dc53-4ec1-b3d9-327d56118a55 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.448264] env[60024]: DEBUG oslo_vmware.api [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Waiting for the task: (returnval){ [ 751.448264] env[60024]: value = "task-4576257" [ 751.448264] env[60024]: _type = "Task" [ 751.448264] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 751.456806] env[60024]: DEBUG oslo_vmware.api [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Task: {'id': task-4576257, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 751.960254] env[60024]: DEBUG oslo_vmware.exceptions [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Fault InvalidArgument not matched. 
{{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 751.960254] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 751.960254] env[60024]: ERROR nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 751.960254] env[60024]: Faults: ['InvalidArgument'] [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Traceback (most recent call last): [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] yield resources [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] self.driver.spawn(context, instance, image_meta, [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] self._vmops.spawn(context, instance, image_meta, injected_files, [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] self._fetch_image_if_missing(context, vi) [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] image_cache(vi, tmp_image_ds_loc) [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] vm_util.copy_virtual_disk( [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] session._wait_for_task(vmdk_copy_task) [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] return self.wait_for_task(task_ref) [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] return evt.wait() [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] result = hub.switch() [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] return self.greenlet.switch() [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 751.960254] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] self.f(*self.args, **self.kw) [ 751.961461] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 751.961461] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] raise exceptions.translate_fault(task_info.error) [ 751.961461] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 751.961461] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Faults: ['InvalidArgument'] [ 751.961461] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] [ 751.961461] env[60024]: INFO nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Terminating instance [ 751.961649] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 751.961805] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 751.962089] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-14406095-f2f6-450c-932e-e64fc770e03d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.964324] 
env[60024]: DEBUG nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 751.964521] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 751.965243] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09026a8a-5614-4f6a-ae81-8a5e1c2c9e95 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.973739] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 751.973883] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-67ff01e6-052c-41c5-a8a2-2e59adf5a816 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.976116] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 751.976289] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 751.977219] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e7eee855-30d8-4bcb-8078-a2ec13670bb7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 751.982423] env[60024]: DEBUG oslo_vmware.api [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Waiting for the task: (returnval){ [ 751.982423] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]524e508a-68ff-56b4-1533-d7ca23384c1b" [ 751.982423] env[60024]: _type = "Task" [ 751.982423] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 751.994671] env[60024]: DEBUG oslo_vmware.api [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]524e508a-68ff-56b4-1533-d7ca23384c1b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 752.045887] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 752.046167] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 752.046369] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Deleting the datastore file [datastore2] c4400e80-4457-4a8a-8588-f594e5993cde {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 752.046718] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a3d1791c-91ab-4133-97bf-ab18140822e6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.054511] env[60024]: DEBUG oslo_vmware.api [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Waiting for the task: (returnval){ [ 752.054511] env[60024]: value = "task-4576259" [ 752.054511] env[60024]: _type = "Task" [ 752.054511] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 752.062662] env[60024]: DEBUG oslo_vmware.api [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Task: {'id': task-4576259, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 752.493151] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 752.493511] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Creating directory with path [datastore2] vmware_temp/5a655855-04eb-42cb-9106-41d56242d844/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 752.493748] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fa67f495-77ee-415d-951c-011e760a6c0f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.509438] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Created directory with path [datastore2] vmware_temp/5a655855-04eb-42cb-9106-41d56242d844/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 752.509660] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Fetch image to [datastore2] vmware_temp/5a655855-04eb-42cb-9106-41d56242d844/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 752.509831] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/5a655855-04eb-42cb-9106-41d56242d844/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 752.510706] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b5da617-4eb1-4a35-8b2c-79af6c967d89 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.522874] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8af732f0-3d0b-4d84-9a42-79cbc243096e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.533562] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0505db4a-6936-40e3-bfd4-88ff7dad7161 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.567156] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91791133-d632-476d-82eb-5a4e5ba9aa8d {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.576786] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-db229041-e215-4e26-b8e5-068e3730c070 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.578567] env[60024]: DEBUG oslo_vmware.api [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Task: {'id': task-4576259, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070292} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 752.578792] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 752.578973] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 752.579199] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 752.579382] env[60024]: INFO nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Took 0.61 seconds to destroy the instance on the hypervisor. 
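The "InvalidArgument: fileType" fault recorded above is surfaced by oslo_vmware's task poller: wait_for_task() waits on the looping call and _poll_task() re-raises the task error via exceptions.translate_fault(), which is the VimFaultException seen in the traceback. A minimal sketch of that calling pattern, assuming a hypothetical session object (an oslo_vmware VMwareAPISession) and task reference; only wait_for_task() and VimFaultException are taken from the log:

    from oslo_vmware import exceptions as vexc

    def wait_for_copy(session, vmdk_copy_task):
        # wait_for_task() blocks until the vCenter task completes and raises the
        # translated fault when the task errors, as in the traceback above.
        try:
            return session.wait_for_task(vmdk_copy_task)
        except vexc.VimFaultException as exc:
            # fault_list carries the vCenter fault names, e.g. ['InvalidArgument']
            # for "A specified parameter was not correct: fileType".
            print("disk copy failed, faults: %s" % exc.fault_list)
            raise

In the run logged here this same fault is what later drives the re-schedule of instance c4400e80-4457-4a8a-8588-f594e5993cde.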
[ 752.581802] env[60024]: DEBUG nova.compute.claims [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 752.582031] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 752.582276] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 752.604731] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 752.667751] env[60024]: DEBUG oslo_vmware.rw_handles [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5a655855-04eb-42cb-9106-41d56242d844/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 752.727143] env[60024]: DEBUG oslo_vmware.rw_handles [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Completed reading data from the image iterator. {{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 752.727251] env[60024]: DEBUG oslo_vmware.rw_handles [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5a655855-04eb-42cb-9106-41d56242d844/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 752.929023] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8303958e-e6df-4774-a135-8f72a573d200 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.938241] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-825deca9-2650-4f37-8209-32fa4459f0f4 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.968798] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f856ef4-493b-4b30-bd3a-40d75ba58b1d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.976869] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00c261b7-af11-4c89-890e-c865726cce8d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 752.990663] env[60024]: DEBUG nova.compute.provider_tree [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 752.999593] env[60024]: DEBUG nova.scheduler.client.report [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 753.013171] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.431s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.013747] env[60024]: ERROR nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 753.013747] env[60024]: Faults: ['InvalidArgument'] [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Traceback (most recent call last): [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 753.013747] env[60024]: ERROR 
nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] self.driver.spawn(context, instance, image_meta, [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] self._vmops.spawn(context, instance, image_meta, injected_files, [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] self._fetch_image_if_missing(context, vi) [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] image_cache(vi, tmp_image_ds_loc) [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] vm_util.copy_virtual_disk( [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] session._wait_for_task(vmdk_copy_task) [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] return self.wait_for_task(task_ref) [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] return evt.wait() [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] result = hub.switch() [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] return self.greenlet.switch() [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] self.f(*self.args, **self.kw) [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task 
[ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] raise exceptions.translate_fault(task_info.error) [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Faults: ['InvalidArgument'] [ 753.013747] env[60024]: ERROR nova.compute.manager [instance: c4400e80-4457-4a8a-8588-f594e5993cde] [ 753.014576] env[60024]: DEBUG nova.compute.utils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] VimFaultException {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 753.015909] env[60024]: DEBUG nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Build of instance c4400e80-4457-4a8a-8588-f594e5993cde was re-scheduled: A specified parameter was not correct: fileType [ 753.015909] env[60024]: Faults: ['InvalidArgument'] {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 753.016283] env[60024]: DEBUG nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 753.016459] env[60024]: DEBUG nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 753.016624] env[60024]: DEBUG nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 753.016879] env[60024]: DEBUG nova.network.neutron [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 753.255303] env[60024]: DEBUG nova.network.neutron [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.266333] env[60024]: INFO nova.compute.manager [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] [instance: c4400e80-4457-4a8a-8588-f594e5993cde] Took 0.25 seconds to deallocate network for instance. [ 753.372420] env[60024]: INFO nova.scheduler.client.report [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Deleted allocations for instance c4400e80-4457-4a8a-8588-f594e5993cde [ 753.395664] env[60024]: DEBUG oslo_concurrency.lockutils [None req-268f4dc5-3f67-4434-af1a-623b52e88f81 tempest-ServerDiagnosticsNegativeTest-1478228549 tempest-ServerDiagnosticsNegativeTest-1478228549-project-member] Lock "c4400e80-4457-4a8a-8588-f594e5993cde" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 186.965s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.426397] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Starting instance... 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 753.477393] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 753.477954] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 753.479701] env[60024]: INFO nova.compute.claims [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 753.778574] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0fba278-fd3b-4d7c-bbbe-509ace06ae21 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.787680] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c2d4662-cdc6-4a85-a81e-cbfbf1546fee {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.818167] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68b69edf-8b50-45f9-98cb-412fb5ec4607 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.826736] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bdbfc47-80cd-42b3-bb6b-d84579b1b18b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 753.840718] env[60024]: DEBUG nova.compute.provider_tree [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 753.849205] env[60024]: DEBUG nova.scheduler.client.report [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 753.864009] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 
tempest-MultipleCreateTestJSON-328947127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.386s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 753.864599] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 753.898464] env[60024]: DEBUG nova.compute.utils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 753.899771] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 753.899963] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 753.908339] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 753.974473] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 753.997846] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 753.998099] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 753.998261] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 753.998444] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 753.998587] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 753.998729] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 753.999055] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 753.999118] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 753.999260] env[60024]: DEBUG nova.virt.hardware [None 
req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 753.999419] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 753.999588] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 754.000471] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fe5cb12-c74c-4a54-9b1d-9721c8e911a8 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.007069] env[60024]: DEBUG nova.policy [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff2598db3b974d7685d57094808f2ef8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6c6cba030fa2464f98c773682138ae9c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 754.011921] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56cb49bf-45d8-4e47-a56f-5619e718b094 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 754.478468] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Successfully created port: cae1a3f6-5e7d-4356-ad0d-b269f2feb62a {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 755.572395] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Successfully updated port: cae1a3f6-5e7d-4356-ad0d-b269f2feb62a {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 755.584590] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquiring lock "refresh_cache-54919bf0-b9f3-4bfc-ba1a-c6a52013e351" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 755.584936] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 
tempest-MultipleCreateTestJSON-328947127-project-member] Acquired lock "refresh_cache-54919bf0-b9f3-4bfc-ba1a-c6a52013e351" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 755.584936] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 755.611532] env[60024]: DEBUG nova.compute.manager [req-a6af69e6-8b4f-45e5-a4cf-7a4016549efe req-a2f7ef6e-95fa-411c-95c0-337eb8c1dfe2 service nova] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Received event network-vif-plugged-cae1a3f6-5e7d-4356-ad0d-b269f2feb62a {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 755.612651] env[60024]: DEBUG oslo_concurrency.lockutils [req-a6af69e6-8b4f-45e5-a4cf-7a4016549efe req-a2f7ef6e-95fa-411c-95c0-337eb8c1dfe2 service nova] Acquiring lock "54919bf0-b9f3-4bfc-ba1a-c6a52013e351-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 755.612651] env[60024]: DEBUG oslo_concurrency.lockutils [req-a6af69e6-8b4f-45e5-a4cf-7a4016549efe req-a2f7ef6e-95fa-411c-95c0-337eb8c1dfe2 service nova] Lock "54919bf0-b9f3-4bfc-ba1a-c6a52013e351-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 755.612651] env[60024]: DEBUG oslo_concurrency.lockutils [req-a6af69e6-8b4f-45e5-a4cf-7a4016549efe req-a2f7ef6e-95fa-411c-95c0-337eb8c1dfe2 service nova] Lock "54919bf0-b9f3-4bfc-ba1a-c6a52013e351-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 755.613021] env[60024]: DEBUG nova.compute.manager [req-a6af69e6-8b4f-45e5-a4cf-7a4016549efe req-a2f7ef6e-95fa-411c-95c0-337eb8c1dfe2 service nova] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] No waiting events found dispatching network-vif-plugged-cae1a3f6-5e7d-4356-ad0d-b269f2feb62a {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 755.613220] env[60024]: WARNING nova.compute.manager [req-a6af69e6-8b4f-45e5-a4cf-7a4016549efe req-a2f7ef6e-95fa-411c-95c0-337eb8c1dfe2 service nova] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Received unexpected event network-vif-plugged-cae1a3f6-5e7d-4356-ad0d-b269f2feb62a for instance with vm_state building and task_state spawning. [ 755.695015] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Instance cache missing network info. 
{{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 755.952465] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 756.340341] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Updating instance_info_cache with network_info: [{"id": "cae1a3f6-5e7d-4356-ad0d-b269f2feb62a", "address": "fa:16:3e:72:6c:ed", "network": {"id": "558ef47b-e753-425f-8df2-741c798b60a9", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-860543556-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6c6cba030fa2464f98c773682138ae9c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcae1a3f6-5e", "ovs_interfaceid": "cae1a3f6-5e7d-4356-ad0d-b269f2feb62a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 756.341624] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 756.341810] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 756.355759] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Releasing lock "refresh_cache-54919bf0-b9f3-4bfc-ba1a-c6a52013e351" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 756.355847] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Instance network_info: |[{"id": "cae1a3f6-5e7d-4356-ad0d-b269f2feb62a", "address": "fa:16:3e:72:6c:ed", "network": {"id": "558ef47b-e753-425f-8df2-741c798b60a9", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-860543556-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6c6cba030fa2464f98c773682138ae9c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcae1a3f6-5e", "ovs_interfaceid": "cae1a3f6-5e7d-4356-ad0d-b269f2feb62a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 756.356187] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:72:6c:ed', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cae1a3f6-5e7d-4356-ad0d-b269f2feb62a', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 756.363958] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Creating folder: Project (6c6cba030fa2464f98c773682138ae9c). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 756.364559] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2f46225c-fc11-4d77-82a7-6f5b57490865 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.377488] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Created folder: Project (6c6cba030fa2464f98c773682138ae9c) in parent group-v894073. [ 756.378825] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Creating folder: Instances. Parent ref: group-v894115. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 756.378825] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2b7977fa-6627-47dd-b071-fdc14470f125 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.389720] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Created folder: Instances in parent group-v894115. 
[ 756.389992] env[60024]: DEBUG oslo.service.loopingcall [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 756.390211] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 756.390420] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2339f029-cfb4-4637-9b8c-84f4263e2e99 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.414058] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 756.414058] env[60024]: value = "task-4576262" [ 756.414058] env[60024]: _type = "Task" [ 756.414058] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 756.426323] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576262, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 756.930023] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576262, 'name': CreateVM_Task, 'duration_secs': 0.333645} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 756.930023] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 756.930023] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 756.930023] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 756.930023] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 756.930023] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2fec7ad2-f533-44a4-9b1d-e87ddbcc6891 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.934694] env[60024]: DEBUG oslo_vmware.api [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Waiting for the task: (returnval){ [ 756.934694] env[60024]: value = 
"session[52034186-32b2-4163-5772-6df9eda9abbc]521b6533-63b8-3e70-60cb-2ff0f3f4f7b1" [ 756.934694] env[60024]: _type = "Task" [ 756.934694] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 756.944102] env[60024]: DEBUG oslo_vmware.api [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]521b6533-63b8-3e70-60cb-2ff0f3f4f7b1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 757.341635] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 757.341831] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Starting heal instance info cache {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 757.341958] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Rebuilding the list of instances to heal {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 757.365796] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 757.365966] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 757.366257] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 757.366394] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 757.366523] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 757.366666] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 757.366848] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Skipping network cache update for instance because it is Building. 
{{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 757.367137] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 757.367137] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 757.367234] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 757.367349] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Didn't find any instances for network info cache update. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 757.367846] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 757.368036] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 757.449822] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 757.450229] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 757.450326] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 757.674610] env[60024]: DEBUG nova.compute.manager [req-9810beb6-1597-41af-a2ca-937d5c9eff80 req-b24f9857-3968-4ea4-9c08-cb3329ebab4c service nova] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Received event network-changed-cae1a3f6-5e7d-4356-ad0d-b269f2feb62a {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 757.674803] env[60024]: DEBUG nova.compute.manager [req-9810beb6-1597-41af-a2ca-937d5c9eff80 
req-b24f9857-3968-4ea4-9c08-cb3329ebab4c service nova] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Refreshing instance network info cache due to event network-changed-cae1a3f6-5e7d-4356-ad0d-b269f2feb62a. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 757.675028] env[60024]: DEBUG oslo_concurrency.lockutils [req-9810beb6-1597-41af-a2ca-937d5c9eff80 req-b24f9857-3968-4ea4-9c08-cb3329ebab4c service nova] Acquiring lock "refresh_cache-54919bf0-b9f3-4bfc-ba1a-c6a52013e351" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 757.675185] env[60024]: DEBUG oslo_concurrency.lockutils [req-9810beb6-1597-41af-a2ca-937d5c9eff80 req-b24f9857-3968-4ea4-9c08-cb3329ebab4c service nova] Acquired lock "refresh_cache-54919bf0-b9f3-4bfc-ba1a-c6a52013e351" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 757.675340] env[60024]: DEBUG nova.network.neutron [req-9810beb6-1597-41af-a2ca-937d5c9eff80 req-b24f9857-3968-4ea4-9c08-cb3329ebab4c service nova] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Refreshing network info cache for port cae1a3f6-5e7d-4356-ad0d-b269f2feb62a {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 758.231583] env[60024]: DEBUG nova.network.neutron [req-9810beb6-1597-41af-a2ca-937d5c9eff80 req-b24f9857-3968-4ea4-9c08-cb3329ebab4c service nova] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Updated VIF entry in instance network info cache for port cae1a3f6-5e7d-4356-ad0d-b269f2feb62a. {{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 758.231583] env[60024]: DEBUG nova.network.neutron [req-9810beb6-1597-41af-a2ca-937d5c9eff80 req-b24f9857-3968-4ea4-9c08-cb3329ebab4c service nova] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Updating instance_info_cache with network_info: [{"id": "cae1a3f6-5e7d-4356-ad0d-b269f2feb62a", "address": "fa:16:3e:72:6c:ed", "network": {"id": "558ef47b-e753-425f-8df2-741c798b60a9", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-860543556-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6c6cba030fa2464f98c773682138ae9c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcae1a3f6-5e", "ovs_interfaceid": "cae1a3f6-5e7d-4356-ad0d-b269f2feb62a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.241489] env[60024]: DEBUG oslo_concurrency.lockutils [req-9810beb6-1597-41af-a2ca-937d5c9eff80 req-b24f9857-3968-4ea4-9c08-cb3329ebab4c service nova] Releasing lock "refresh_cache-54919bf0-b9f3-4bfc-ba1a-c6a52013e351" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 758.341017] env[60024]: DEBUG 
oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 758.343347] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 758.343347] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 758.360280] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 758.360280] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 758.360280] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 758.360280] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60024) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 758.362079] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc2a0e91-b3ad-477c-a763-b1529ee940f1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.373880] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3119966c-c2f8-459b-b773-e89e85af4aeb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.391469] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e4682af-cb79-4b0c-b2a3-00fdf315d9c9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.400839] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-493110c0-6b90-4a80-98d2-d359c90a98e8 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.432331] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180649MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60024) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 758.432497] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 758.432704] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 758.504087] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 7a4778b7-5ffc-4641-b968-d0304fd67ee0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 758.504276] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 37916d26-1b5e-4991-83a2-ca5a5b00c2ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 758.504406] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 363f5261-d589-4f99-b7dd-ab8f16cefee3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 758.504531] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance ce222a29-3611-45b3-9664-87ae2fb1b1b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 758.504654] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 9214a18f-c22d-4e24-980e-7241a2b993bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 758.504772] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 3ab1b905-cd6f-4d2b-a244-f85e56f796d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 758.504898] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 036d6de2-f69b-4714-b89e-9c4307253675 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 758.504996] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 758.505128] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 08e2d758-9005-4822-b157-84710b9c5ed4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 758.505246] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 54919bf0-b9f3-4bfc-ba1a-c6a52013e351 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 758.521948] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance a925d5fc-6437-40bb-adf1-ea10c32dde2a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 758.534897] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 076c3dd5-9043-456d-af24-0d2273321085 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 758.545537] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 1eaf8e02-bfb0-4928-9687-cc781a84d16d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 758.557122] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance c841298b-f103-4dc7-8884-efdf2ebc20a6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 758.567740] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance bf70d23b-4ab5-476e-814c-264b6a9f2455 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 758.578892] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 67ff7d52-6e30-4730-9b5a-9ae32f68b953 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 758.609636] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance bd91e947-acae-4dbd-b48b-5a6727eb4cbb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 758.622456] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 95358801-c9d8-4582-a712-36a8bf586456 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 758.636515] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance be6a4290-dbb3-4e1f-bdd4-0dc106db9435 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 758.636772] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 758.636919] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=100GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 758.928016] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b366ab54-5829-4b70-91b4-fa0ea06cefb8 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.936213] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d98c25b-dd91-4444-b052-81b36813f38a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.974740] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ea3c9ac-5da0-427d-b259-38f20614483c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.983622] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3499ddf-bae5-49e1-86f2-416c93f637bc {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 758.998118] env[60024]: DEBUG nova.compute.provider_tree [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 759.013813] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 759.026959] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60024) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 759.027322] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.595s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 760.027621] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running 
periodic task ComputeManager._reclaim_queued_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 760.027933] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60024) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 767.519216] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Acquiring lock "fcf47169-eb7a-4644-bf3f-7150c44c247f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 767.519583] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Lock "fcf47169-eb7a-4644-bf3f-7150c44c247f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 768.439238] env[60024]: DEBUG oslo_concurrency.lockutils [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "7a4778b7-5ffc-4641-b968-d0304fd67ee0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 770.665453] env[60024]: DEBUG oslo_concurrency.lockutils [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Acquiring lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 772.169802] env[60024]: DEBUG oslo_concurrency.lockutils [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Acquiring lock "363f5261-d589-4f99-b7dd-ab8f16cefee3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 778.905889] env[60024]: DEBUG oslo_concurrency.lockutils [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Acquiring lock "9214a18f-c22d-4e24-980e-7241a2b993bd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 780.041993] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "3ab1b905-cd6f-4d2b-a244-f85e56f796d3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60024) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 780.675487] env[60024]: DEBUG oslo_concurrency.lockutils [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Acquiring lock "036d6de2-f69b-4714-b89e-9c4307253675" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 781.138670] env[60024]: DEBUG oslo_concurrency.lockutils [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Acquiring lock "5888cc9f-7341-4f9c-a93c-dd5ec95f7369" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 801.454294] env[60024]: WARNING oslo_vmware.rw_handles [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 801.454294] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 801.454294] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 801.454294] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 801.454294] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 801.454294] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 801.454294] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 801.454294] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 801.454294] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 801.454294] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 801.454294] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 801.454294] env[60024]: ERROR oslo_vmware.rw_handles [ 801.454985] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to vmware_temp/5a655855-04eb-42cb-9106-41d56242d844/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 801.456311] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 801.456568] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 
tempest-DeleteServersAdminTestJSON-4043494-project-member] Copying Virtual Disk [datastore2] vmware_temp/5a655855-04eb-42cb-9106-41d56242d844/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/5a655855-04eb-42cb-9106-41d56242d844/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 801.456836] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1f726d88-4b1f-480d-8d54-3ec3998365b8 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.466057] env[60024]: DEBUG oslo_vmware.api [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Waiting for the task: (returnval){ [ 801.466057] env[60024]: value = "task-4576263" [ 801.466057] env[60024]: _type = "Task" [ 801.466057] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 801.474126] env[60024]: DEBUG oslo_vmware.api [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Task: {'id': task-4576263, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 801.976960] env[60024]: DEBUG oslo_vmware.exceptions [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Fault InvalidArgument not matched. {{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 801.977193] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 801.978022] env[60024]: ERROR nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 801.978022] env[60024]: Faults: ['InvalidArgument'] [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Traceback (most recent call last): [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] yield resources [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] self.driver.spawn(context, instance, image_meta, [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 
7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] self._fetch_image_if_missing(context, vi) [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] image_cache(vi, tmp_image_ds_loc) [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] vm_util.copy_virtual_disk( [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] session._wait_for_task(vmdk_copy_task) [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] return self.wait_for_task(task_ref) [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] return evt.wait() [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] result = hub.switch() [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] return self.greenlet.switch() [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] self.f(*self.args, **self.kw) [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] raise exceptions.translate_fault(task_info.error) [ 801.978022] env[60024]: ERROR 
nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Faults: ['InvalidArgument'] [ 801.978022] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] [ 801.979078] env[60024]: INFO nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Terminating instance [ 801.980080] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 801.980297] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 801.980612] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c5a6b736-ab1c-415e-a655-c3cf69dc5b01 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.983125] env[60024]: DEBUG nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 801.983323] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 801.984108] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73a3e225-7f18-4d64-a9b2-da55e8ab52ab {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.992273] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 801.992571] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-78d25897-5987-4ac9-98b8-48ff5a051616 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 801.995962] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 801.995962] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 801.996605] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1ef50471-6b1c-49cc-9a33-8083fdb36c77 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.002284] env[60024]: DEBUG oslo_vmware.api [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Waiting for the task: (returnval){ [ 802.002284] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]522d36fd-53a5-b61d-0298-7c185420c36d" [ 802.002284] env[60024]: _type = "Task" [ 802.002284] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 802.011111] env[60024]: DEBUG oslo_vmware.api [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]522d36fd-53a5-b61d-0298-7c185420c36d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 802.073294] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 802.073567] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 802.073768] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Deleting the datastore file [datastore2] 7a4778b7-5ffc-4641-b968-d0304fd67ee0 {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 802.074049] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-af657bff-bc9e-40aa-8286-a0bc504bc2cf {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.081695] env[60024]: DEBUG oslo_vmware.api [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Waiting for the task: (returnval){ [ 802.081695] env[60024]: value = "task-4576265" [ 802.081695] env[60024]: _type = "Task" [ 802.081695] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 802.090366] env[60024]: DEBUG oslo_vmware.api [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Task: {'id': task-4576265, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 802.513349] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 802.513792] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Creating directory with path [datastore2] vmware_temp/5552e8dc-95d7-4584-984f-30243e474d6e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 802.513890] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-70afff32-8536-452d-a9ec-648cbcea7c19 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.527373] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Created directory with path [datastore2] vmware_temp/5552e8dc-95d7-4584-984f-30243e474d6e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 802.527573] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Fetch image to [datastore2] vmware_temp/5552e8dc-95d7-4584-984f-30243e474d6e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 802.527740] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/5552e8dc-95d7-4584-984f-30243e474d6e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 802.528511] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86af8b33-8fbe-4350-9493-dcd4b9811ee6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.535798] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8532eb6b-0a75-400e-aca8-50279f50e464 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.545062] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55e74126-3398-406b-8644-d97b4a9ac413 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.578466] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b54c6874-197e-4115-a988-5fbc5bf59214 {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.599396] env[60024]: DEBUG oslo_vmware.api [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Task: {'id': task-4576265, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070775} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 802.599719] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b7c7fa26-c9db-40ee-9e1d-f2c560ac4648 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.602792] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 802.602946] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 802.603156] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 802.603462] env[60024]: INFO nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 802.607502] env[60024]: DEBUG nova.compute.claims [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 802.607699] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 802.607932] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 802.632049] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 802.684180] env[60024]: DEBUG oslo_vmware.rw_handles [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5552e8dc-95d7-4584-984f-30243e474d6e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 802.744018] env[60024]: DEBUG oslo_vmware.rw_handles [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Completed reading data from the image iterator. {{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 802.744018] env[60024]: DEBUG oslo_vmware.rw_handles [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5552e8dc-95d7-4584-984f-30243e474d6e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 802.952984] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ab9156c-59ba-4f63-8d25-bb655bcc7cef {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.960805] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-900af0d5-9d5e-4822-9718-8ea2dff56843 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.993806] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13c844ee-68a2-450b-835e-5aab9bfe442c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.002008] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfb3f9f2-cccb-4f4e-b866-3303d4f935bb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.015716] env[60024]: DEBUG nova.compute.provider_tree [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 803.025323] env[60024]: DEBUG nova.scheduler.client.report [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 803.039846] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.432s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 803.040441] env[60024]: ERROR nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 803.040441] env[60024]: Faults: ['InvalidArgument'] [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Traceback (most recent call last): [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 
7a4778b7-5ffc-4641-b968-d0304fd67ee0] self.driver.spawn(context, instance, image_meta, [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] self._fetch_image_if_missing(context, vi) [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] image_cache(vi, tmp_image_ds_loc) [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] vm_util.copy_virtual_disk( [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] session._wait_for_task(vmdk_copy_task) [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] return self.wait_for_task(task_ref) [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] return evt.wait() [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] result = hub.switch() [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] return self.greenlet.switch() [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] self.f(*self.args, **self.kw) [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 803.040441] env[60024]: ERROR 
nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] raise exceptions.translate_fault(task_info.error) [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Faults: ['InvalidArgument'] [ 803.040441] env[60024]: ERROR nova.compute.manager [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] [ 803.041515] env[60024]: DEBUG nova.compute.utils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] VimFaultException {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 803.043051] env[60024]: DEBUG nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Build of instance 7a4778b7-5ffc-4641-b968-d0304fd67ee0 was re-scheduled: A specified parameter was not correct: fileType [ 803.043051] env[60024]: Faults: ['InvalidArgument'] {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 803.043702] env[60024]: DEBUG nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 803.043702] env[60024]: DEBUG nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 803.043898] env[60024]: DEBUG nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 803.044118] env[60024]: DEBUG nova.network.neutron [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 803.555722] env[60024]: DEBUG nova.network.neutron [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 803.571595] env[60024]: INFO nova.compute.manager [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Took 0.53 seconds to deallocate network for instance. [ 803.661520] env[60024]: INFO nova.scheduler.client.report [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Deleted allocations for instance 7a4778b7-5ffc-4641-b968-d0304fd67ee0 [ 803.694779] env[60024]: DEBUG oslo_concurrency.lockutils [None req-70d4942f-629a-46a2-bfe7-7decf2e3298a tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "7a4778b7-5ffc-4641-b968-d0304fd67ee0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 234.525s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 803.695757] env[60024]: DEBUG oslo_concurrency.lockutils [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "7a4778b7-5ffc-4641-b968-d0304fd67ee0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 35.257s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 803.695993] env[60024]: DEBUG oslo_concurrency.lockutils [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "7a4778b7-5ffc-4641-b968-d0304fd67ee0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 803.696907] env[60024]: DEBUG oslo_concurrency.lockutils [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "7a4778b7-5ffc-4641-b968-d0304fd67ee0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s 
{{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 803.696907] env[60024]: DEBUG oslo_concurrency.lockutils [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "7a4778b7-5ffc-4641-b968-d0304fd67ee0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 803.698509] env[60024]: INFO nova.compute.manager [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Terminating instance [ 803.700236] env[60024]: DEBUG nova.compute.manager [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 803.700428] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 803.700895] env[60024]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cf5413fe-4e93-4a72-8ea2-a5296d9ce8a7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.711030] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-645d9354-e9e7-4644-ac05-4085b6685a5c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.722485] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 803.746656] env[60024]: WARNING nova.virt.vmwareapi.vmops [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7a4778b7-5ffc-4641-b968-d0304fd67ee0 could not be found. [ 803.746860] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 803.747043] env[60024]: INFO nova.compute.manager [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 803.747288] env[60024]: DEBUG oslo.service.loopingcall [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 803.747506] env[60024]: DEBUG nova.compute.manager [-] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 803.747614] env[60024]: DEBUG nova.network.neutron [-] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 803.784298] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 803.784553] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 803.786011] env[60024]: INFO nova.compute.claims [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 803.789785] env[60024]: DEBUG nova.network.neutron [-] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 803.797568] env[60024]: INFO nova.compute.manager [-] [instance: 7a4778b7-5ffc-4641-b968-d0304fd67ee0] Took 0.05 seconds to deallocate network for instance. 
[ 803.915674] env[60024]: DEBUG oslo_concurrency.lockutils [None req-4a00afdc-ccd3-4d69-8364-579e565b6c11 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "7a4778b7-5ffc-4641-b968-d0304fd67ee0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.220s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 804.122184] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c98905c0-0c7e-4451-b383-3a14c9ea276c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.130505] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4428aaf-5acc-4023-bc9c-827cfa3d2929 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.163789] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-604b77a5-3536-46a6-80b4-d891e4959663 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.172289] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8a86ffa-9fc9-40fb-b3c2-de8f40a07a5c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.187141] env[60024]: DEBUG nova.compute.provider_tree [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 804.195921] env[60024]: DEBUG nova.scheduler.client.report [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 804.212020] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.427s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 804.212020] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Start building networks asynchronously for instance. 
{{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 804.243273] env[60024]: DEBUG nova.compute.utils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 804.247200] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 804.247336] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 804.256934] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 804.326812] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 804.338841] env[60024]: DEBUG nova.policy [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff2598db3b974d7685d57094808f2ef8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6c6cba030fa2464f98c773682138ae9c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 804.351056] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 804.351316] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 804.351474] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 804.351655] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 804.351801] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 804.351946] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 804.352283] env[60024]: DEBUG nova.virt.hardware [None 
req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 804.352443] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 804.352570] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 804.352741] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 804.353077] env[60024]: DEBUG nova.virt.hardware [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 804.354092] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2888034-1dd9-4719-832f-1691ebffe4a7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.363378] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7b5ad52-19a6-477e-af70-3c11c1865d9f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.609039] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "d55ee9a1-6921-4648-ace2-f2da13c3523e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 804.609516] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "d55ee9a1-6921-4648-ace2-f2da13c3523e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 804.851300] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Successfully created port: d143b257-e041-431a-bee6-35881166ba75 {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 805.746616] 
env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Successfully updated port: d143b257-e041-431a-bee6-35881166ba75 {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 805.758170] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquiring lock "refresh_cache-a925d5fc-6437-40bb-adf1-ea10c32dde2a" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 805.758331] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquired lock "refresh_cache-a925d5fc-6437-40bb-adf1-ea10c32dde2a" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 805.758503] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 805.792314] env[60024]: DEBUG nova.compute.manager [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Received event network-vif-plugged-d143b257-e041-431a-bee6-35881166ba75 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 805.792314] env[60024]: DEBUG oslo_concurrency.lockutils [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] Acquiring lock "a925d5fc-6437-40bb-adf1-ea10c32dde2a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 805.792314] env[60024]: DEBUG oslo_concurrency.lockutils [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] Lock "a925d5fc-6437-40bb-adf1-ea10c32dde2a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 805.792314] env[60024]: DEBUG oslo_concurrency.lockutils [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] Lock "a925d5fc-6437-40bb-adf1-ea10c32dde2a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 805.792314] env[60024]: DEBUG nova.compute.manager [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] No waiting events found dispatching network-vif-plugged-d143b257-e041-431a-bee6-35881166ba75 {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 805.792314] env[60024]: WARNING nova.compute.manager [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] [instance: 
a925d5fc-6437-40bb-adf1-ea10c32dde2a] Received unexpected event network-vif-plugged-d143b257-e041-431a-bee6-35881166ba75 for instance with vm_state building and task_state spawning. [ 805.792314] env[60024]: DEBUG nova.compute.manager [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Received event network-changed-d143b257-e041-431a-bee6-35881166ba75 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 805.792314] env[60024]: DEBUG nova.compute.manager [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Refreshing instance network info cache due to event network-changed-d143b257-e041-431a-bee6-35881166ba75. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 805.793151] env[60024]: DEBUG oslo_concurrency.lockutils [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] Acquiring lock "refresh_cache-a925d5fc-6437-40bb-adf1-ea10c32dde2a" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 805.824871] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 806.311236] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Updating instance_info_cache with network_info: [{"id": "d143b257-e041-431a-bee6-35881166ba75", "address": "fa:16:3e:d4:b0:b1", "network": {"id": "558ef47b-e753-425f-8df2-741c798b60a9", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-860543556-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6c6cba030fa2464f98c773682138ae9c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd143b257-e0", "ovs_interfaceid": "d143b257-e041-431a-bee6-35881166ba75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 806.324614] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Releasing lock "refresh_cache-a925d5fc-6437-40bb-adf1-ea10c32dde2a" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} 
[ 806.324929] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Instance network_info: |[{"id": "d143b257-e041-431a-bee6-35881166ba75", "address": "fa:16:3e:d4:b0:b1", "network": {"id": "558ef47b-e753-425f-8df2-741c798b60a9", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-860543556-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6c6cba030fa2464f98c773682138ae9c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd143b257-e0", "ovs_interfaceid": "d143b257-e041-431a-bee6-35881166ba75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 806.325361] env[60024]: DEBUG oslo_concurrency.lockutils [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] Acquired lock "refresh_cache-a925d5fc-6437-40bb-adf1-ea10c32dde2a" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 806.325553] env[60024]: DEBUG nova.network.neutron [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Refreshing network info cache for port d143b257-e041-431a-bee6-35881166ba75 {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 806.330019] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d4:b0:b1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd143b257-e041-431a-bee6-35881166ba75', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 806.336207] env[60024]: DEBUG oslo.service.loopingcall [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 806.337163] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 806.340374] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0b83f0e4-1c62-431f-bbc4-96b3c7729277 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.362084] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 806.362084] env[60024]: value = "task-4576266" [ 806.362084] env[60024]: _type = "Task" [ 806.362084] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 806.371296] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576266, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 806.840918] env[60024]: DEBUG nova.network.neutron [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Updated VIF entry in instance network info cache for port d143b257-e041-431a-bee6-35881166ba75. {{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 806.841302] env[60024]: DEBUG nova.network.neutron [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Updating instance_info_cache with network_info: [{"id": "d143b257-e041-431a-bee6-35881166ba75", "address": "fa:16:3e:d4:b0:b1", "network": {"id": "558ef47b-e753-425f-8df2-741c798b60a9", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-860543556-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6c6cba030fa2464f98c773682138ae9c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd143b257-e0", "ovs_interfaceid": "d143b257-e041-431a-bee6-35881166ba75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 806.852497] env[60024]: DEBUG oslo_concurrency.lockutils [req-6029cfe6-1955-4583-82f5-2e4f73baacd3 req-b7c47b5b-cc9e-4d96-9263-b2b0042c6992 service nova] Releasing lock "refresh_cache-a925d5fc-6437-40bb-adf1-ea10c32dde2a" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 806.872458] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576266, 'name': CreateVM_Task, 'duration_secs': 0.330233} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 806.872643] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 806.873334] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 806.873505] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 806.873821] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 806.874075] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5f37db58-a30d-486d-86b4-5cfe21a7799c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.879623] env[60024]: DEBUG oslo_vmware.api [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Waiting for the task: (returnval){ [ 806.879623] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52fc88ad-10e1-7d2c-ac34-d8c6de0fdaf5" [ 806.879623] env[60024]: _type = "Task" [ 806.879623] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 806.889844] env[60024]: DEBUG oslo_vmware.api [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52fc88ad-10e1-7d2c-ac34-d8c6de0fdaf5, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 807.390344] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 807.390611] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 807.390804] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 815.342519] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 815.342820] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Cleaning up deleted instances {{(pid=60024) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 815.354662] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] There are 0 instances to clean {{(pid=60024) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 815.354945] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 815.355151] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Cleaning up deleted instances with incomplete migration {{(pid=60024) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 815.364757] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 816.368809] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 817.343039] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 818.341833] env[60024]: DEBUG oslo_service.periodic_task [None 
req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 818.342219] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 818.351203] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 818.351422] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 818.351600] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 818.351760] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60024) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 818.352881] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-819bac0f-2c17-4b53-8c8f-d3f467ace2d5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.363520] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-697f623b-b558-4add-b7b1-376559f58fe2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.382068] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dad8e6b-a864-41ce-a153-a3ff4b96faa5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.389167] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61734315-8be6-4b94-9f00-cc61c922b9cd {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.419056] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180703MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60024) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 818.419227] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60024) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 818.419434] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 818.539579] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 37916d26-1b5e-4991-83a2-ca5a5b00c2ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 818.539745] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 363f5261-d589-4f99-b7dd-ab8f16cefee3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 818.539887] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance ce222a29-3611-45b3-9664-87ae2fb1b1b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 818.540069] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 9214a18f-c22d-4e24-980e-7241a2b993bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 818.540157] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 3ab1b905-cd6f-4d2b-a244-f85e56f796d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 818.540279] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 036d6de2-f69b-4714-b89e-9c4307253675 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 818.540479] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 818.540602] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 08e2d758-9005-4822-b157-84710b9c5ed4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 818.540718] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 54919bf0-b9f3-4bfc-ba1a-c6a52013e351 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 818.540832] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance a925d5fc-6437-40bb-adf1-ea10c32dde2a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 818.552578] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 076c3dd5-9043-456d-af24-0d2273321085 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 818.564071] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 1eaf8e02-bfb0-4928-9687-cc781a84d16d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 818.574265] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance c841298b-f103-4dc7-8884-efdf2ebc20a6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 818.586250] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance bf70d23b-4ab5-476e-814c-264b6a9f2455 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 818.596174] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 67ff7d52-6e30-4730-9b5a-9ae32f68b953 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 818.608024] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance bd91e947-acae-4dbd-b48b-5a6727eb4cbb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 818.619461] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 95358801-c9d8-4582-a712-36a8bf586456 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 818.631187] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance be6a4290-dbb3-4e1f-bdd4-0dc106db9435 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 818.641591] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance fcf47169-eb7a-4644-bf3f-7150c44c247f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 818.652309] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance d55ee9a1-6921-4648-ace2-f2da13c3523e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 818.652411] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 818.652565] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=100GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 818.905087] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24b0e3a9-733d-4db1-96f3-9ebb3829a435 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.913474] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f2c92f6-56b5-4d40-b918-8a61b872f5e2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.946160] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b42a3fe-cafe-4e11-a09a-295b3ff95a08 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.955731] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e36aee75-25dc-443d-9fde-6da2c0c7da0c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 818.971053] env[60024]: DEBUG nova.compute.provider_tree [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 818.979700] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 818.996176] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60024) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 818.996359] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 819.996231] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running 
periodic task ComputeManager._heal_instance_info_cache {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 819.996544] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Starting heal instance info cache {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 819.996544] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Rebuilding the list of instances to heal {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 820.018885] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 820.019173] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 820.019265] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 820.019426] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 820.019580] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 820.019746] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 820.019896] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 820.020052] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 820.020180] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Skipping network cache update for instance because it is Building. 
{{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 820.020332] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 820.020484] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Didn't find any instances for network info cache update. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 820.021015] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 820.021258] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 820.361174] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 821.341758] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 821.342094] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60024) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 848.461916] env[60024]: WARNING oslo_vmware.rw_handles [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 848.461916] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 848.461916] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 848.461916] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 848.461916] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 848.461916] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 848.461916] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 848.461916] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 848.461916] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 848.461916] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 848.461916] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 848.461916] env[60024]: ERROR oslo_vmware.rw_handles [ 848.461916] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to vmware_temp/5552e8dc-95d7-4584-984f-30243e474d6e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 848.463698] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 848.463955] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Copying Virtual Disk [datastore2] vmware_temp/5552e8dc-95d7-4584-984f-30243e474d6e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/5552e8dc-95d7-4584-984f-30243e474d6e/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 848.464248] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-07514f2f-007d-4d0b-b790-3bd3cd1da0d6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.473031] env[60024]: DEBUG oslo_vmware.api [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Waiting for the task: (returnval){ [ 848.473031] 
env[60024]: value = "task-4576267" [ 848.473031] env[60024]: _type = "Task" [ 848.473031] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 848.481952] env[60024]: DEBUG oslo_vmware.api [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Task: {'id': task-4576267, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 848.984800] env[60024]: DEBUG oslo_vmware.exceptions [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Fault InvalidArgument not matched. {{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 848.985125] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 848.985719] env[60024]: ERROR nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 848.985719] env[60024]: Faults: ['InvalidArgument'] [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Traceback (most recent call last): [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] yield resources [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] self.driver.spawn(context, instance, image_meta, [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] self._fetch_image_if_missing(context, vi) [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] image_cache(vi, tmp_image_ds_loc) [ 848.985719] env[60024]: 
ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] vm_util.copy_virtual_disk( [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] session._wait_for_task(vmdk_copy_task) [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] return self.wait_for_task(task_ref) [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] return evt.wait() [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] result = hub.switch() [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] return self.greenlet.switch() [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] self.f(*self.args, **self.kw) [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] raise exceptions.translate_fault(task_info.error) [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Faults: ['InvalidArgument'] [ 848.985719] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] [ 848.987358] env[60024]: INFO nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Terminating instance [ 848.987785] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 848.987993] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 848.988301] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f48510d9-e4cc-4809-bb61-20d10f9b247a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.990676] env[60024]: DEBUG nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 848.990857] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 848.991609] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e942b236-8446-41fe-a82e-aadc54780509 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 848.998951] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 848.999193] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-205a85e0-965d-46de-8116-cf107e36944f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.001697] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 849.001870] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 849.002532] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-139ece45-085a-44be-b710-2353f29db416 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.008888] env[60024]: DEBUG oslo_vmware.api [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Waiting for the task: (returnval){ [ 849.008888] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]523f1907-4190-6767-4e14-79f455f513a6" [ 849.008888] env[60024]: _type = "Task" [ 849.008888] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 849.017167] env[60024]: DEBUG oslo_vmware.api [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]523f1907-4190-6767-4e14-79f455f513a6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 849.075030] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 849.075275] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 849.075457] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Deleting the datastore file [datastore2] 363f5261-d589-4f99-b7dd-ab8f16cefee3 {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 849.075729] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-47097299-2836-469b-9b98-4116a4a03163 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.082833] env[60024]: DEBUG oslo_vmware.api [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Waiting for the task: (returnval){ [ 849.082833] env[60024]: value = "task-4576269" [ 849.082833] env[60024]: _type = "Task" [ 849.082833] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 849.091448] env[60024]: DEBUG oslo_vmware.api [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Task: {'id': task-4576269, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 849.521014] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 849.521014] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Creating directory with path [datastore2] vmware_temp/3283a49f-b058-4c8d-a478-a2c57910f694/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 849.521014] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5ebf51ad-a3ae-4377-b1e8-bbf1347f9b73 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.533382] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Created directory with path [datastore2] vmware_temp/3283a49f-b058-4c8d-a478-a2c57910f694/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 849.533686] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Fetch image to [datastore2] vmware_temp/3283a49f-b058-4c8d-a478-a2c57910f694/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 849.533851] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/3283a49f-b058-4c8d-a478-a2c57910f694/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 849.534543] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52c783e2-1936-44fb-920d-d91a59f16bc5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.541945] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8c31908-84f0-4829-b459-a4adcb83e97c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.552102] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42d2dc59-99b9-4ec0-a322-6e1677969bbd {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.587596] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbf127e2-1e77-4b8c-a66d-bd5a0921ccf3 {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.595344] env[60024]: DEBUG oslo_vmware.api [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Task: {'id': task-4576269, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081174} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 849.596837] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 849.597044] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 849.597221] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 849.597395] env[60024]: INFO nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 849.599179] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b9cf5cbe-62ca-4e1c-a39d-3e5538d7efa5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.601171] env[60024]: DEBUG nova.compute.claims [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 849.601344] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 849.601551] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 849.636524] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 849.656169] env[60024]: DEBUG nova.scheduler.client.report [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Refreshing inventories for resource provider 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 849.670290] env[60024]: DEBUG nova.scheduler.client.report [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Updating ProviderTree inventory for provider 5b70561f-4086-4d22-a0b6-aa1035435329 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 849.670569] env[60024]: DEBUG nova.compute.provider_tree [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Updating inventory in ProviderTree for provider 5b70561f-4086-4d22-a0b6-aa1035435329 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 
'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 849.682314] env[60024]: DEBUG nova.scheduler.client.report [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Refreshing aggregate associations for resource provider 5b70561f-4086-4d22-a0b6-aa1035435329, aggregates: None {{(pid=60024) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 849.686065] env[60024]: DEBUG oslo_vmware.rw_handles [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3283a49f-b058-4c8d-a478-a2c57910f694/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 849.739725] env[60024]: DEBUG nova.scheduler.client.report [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Refreshing trait associations for resource provider 5b70561f-4086-4d22-a0b6-aa1035435329, traits: COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE {{(pid=60024) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 849.743995] env[60024]: DEBUG oslo_vmware.rw_handles [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Completed reading data from the image iterator. {{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 849.744079] env[60024]: DEBUG oslo_vmware.rw_handles [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3283a49f-b058-4c8d-a478-a2c57910f694/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 849.987936] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-660a60f1-b642-4665-ab40-ba208e43b372 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 849.995965] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1feb13c5-1e7c-4065-87d0-fe76bdca7095 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.031986] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4f34192-a530-4b4d-9e8f-014f1a8dcf04 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.040252] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb8125e5-2a3f-48d3-9810-0eb218a90294 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.055324] env[60024]: DEBUG nova.compute.provider_tree [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 850.063590] env[60024]: DEBUG nova.scheduler.client.report [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 850.078664] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.477s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 850.079251] env[60024]: ERROR nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 850.079251] env[60024]: Faults: ['InvalidArgument'] [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Traceback (most recent call last): [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] 
self.driver.spawn(context, instance, image_meta, [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] self._fetch_image_if_missing(context, vi) [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] image_cache(vi, tmp_image_ds_loc) [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] vm_util.copy_virtual_disk( [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] session._wait_for_task(vmdk_copy_task) [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] return self.wait_for_task(task_ref) [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] return evt.wait() [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] result = hub.switch() [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] return self.greenlet.switch() [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] self.f(*self.args, **self.kw) [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 
363f5261-d589-4f99-b7dd-ab8f16cefee3] raise exceptions.translate_fault(task_info.error) [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Faults: ['InvalidArgument'] [ 850.079251] env[60024]: ERROR nova.compute.manager [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] [ 850.080194] env[60024]: DEBUG nova.compute.utils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] VimFaultException {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 850.081537] env[60024]: DEBUG nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Build of instance 363f5261-d589-4f99-b7dd-ab8f16cefee3 was re-scheduled: A specified parameter was not correct: fileType [ 850.081537] env[60024]: Faults: ['InvalidArgument'] {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 850.081901] env[60024]: DEBUG nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 850.082091] env[60024]: DEBUG nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 850.082248] env[60024]: DEBUG nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 850.082401] env[60024]: DEBUG nova.network.neutron [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 850.755175] env[60024]: DEBUG nova.network.neutron [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 850.766593] env[60024]: INFO nova.compute.manager [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Took 0.68 seconds to deallocate network for instance. [ 850.861601] env[60024]: INFO nova.scheduler.client.report [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Deleted allocations for instance 363f5261-d589-4f99-b7dd-ab8f16cefee3 [ 850.888023] env[60024]: DEBUG oslo_concurrency.lockutils [None req-fac122e9-f2b2-46f3-ac71-9190de3e53b3 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Lock "363f5261-d589-4f99-b7dd-ab8f16cefee3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 276.970s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 850.888023] env[60024]: DEBUG oslo_concurrency.lockutils [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Lock "363f5261-d589-4f99-b7dd-ab8f16cefee3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 78.718s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 850.888023] env[60024]: DEBUG oslo_concurrency.lockutils [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Acquiring lock "363f5261-d589-4f99-b7dd-ab8f16cefee3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 850.888023] env[60024]: DEBUG oslo_concurrency.lockutils [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Lock "363f5261-d589-4f99-b7dd-ab8f16cefee3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60024) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 850.888023] env[60024]: DEBUG oslo_concurrency.lockutils [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Lock "363f5261-d589-4f99-b7dd-ab8f16cefee3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 850.890760] env[60024]: INFO nova.compute.manager [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Terminating instance [ 850.892840] env[60024]: DEBUG nova.compute.manager [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 850.893293] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 850.893894] env[60024]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8650674e-894b-4143-ba23-79889165c02c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.904349] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e03b45d9-701c-4321-9ae4-5fb83f18ccd4 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 850.916689] env[60024]: DEBUG nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 850.940893] env[60024]: WARNING nova.virt.vmwareapi.vmops [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 363f5261-d589-4f99-b7dd-ab8f16cefee3 could not be found. [ 850.941164] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 850.941307] env[60024]: INFO nova.compute.manager [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Took 0.05 seconds to destroy the instance on the hypervisor. 
[ 850.941601] env[60024]: DEBUG oslo.service.loopingcall [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 850.941842] env[60024]: DEBUG nova.compute.manager [-] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 850.941943] env[60024]: DEBUG nova.network.neutron [-] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 850.977964] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 850.978272] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 850.980047] env[60024]: INFO nova.compute.claims [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 850.990858] env[60024]: DEBUG nova.network.neutron [-] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 851.022379] env[60024]: INFO nova.compute.manager [-] [instance: 363f5261-d589-4f99-b7dd-ab8f16cefee3] Took 0.08 seconds to deallocate network for instance. 
[ 851.034455] env[60024]: DEBUG nova.compute.manager [req-53255d14-3bce-455f-908a-b6bfb4ea0796 req-238af534-140b-4127-aee9-81f32f55e808 service nova] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Received event network-vif-deleted-aa044565-88e5-4dc1-a4ad-2a709936cbba {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 851.161537] env[60024]: DEBUG oslo_concurrency.lockutils [None req-d6fa5508-bfde-475d-b929-f7cafd4464d0 tempest-ServerExternalEventsTest-81672727 tempest-ServerExternalEventsTest-81672727-project-member] Lock "363f5261-d589-4f99-b7dd-ab8f16cefee3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.274s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 851.312938] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a5ab23d-a58c-4cb2-a040-01a711a54f50 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.321623] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-327fab41-6839-41fa-91ab-c6c4b12ac42c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.352365] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e6c91d7-6c93-4308-978e-b2424be01767 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.361016] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1961190-58f4-43fa-af3e-3c1830667fb4 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.375215] env[60024]: DEBUG nova.compute.provider_tree [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 851.383782] env[60024]: DEBUG nova.scheduler.client.report [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 851.400167] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.422s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 851.400744] env[60024]: DEBUG nova.compute.manager [None 
req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 851.438499] env[60024]: DEBUG nova.compute.utils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 851.442797] env[60024]: DEBUG nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 851.442797] env[60024]: DEBUG nova.network.neutron [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 851.453940] env[60024]: DEBUG nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 851.510432] env[60024]: DEBUG nova.policy [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1cdb31acf24540e18e093a55808b2a84', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'af470ef5e36145e3bb547ba685209d97', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 851.527511] env[60024]: DEBUG nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 851.549359] env[60024]: DEBUG nova.virt.hardware [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 851.549639] env[60024]: DEBUG nova.virt.hardware [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 851.549801] env[60024]: DEBUG nova.virt.hardware [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 851.550036] env[60024]: DEBUG nova.virt.hardware [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 851.550189] env[60024]: DEBUG nova.virt.hardware [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 851.550342] env[60024]: DEBUG nova.virt.hardware [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 851.550560] env[60024]: DEBUG nova.virt.hardware [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 851.550725] env[60024]: DEBUG nova.virt.hardware [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 
tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 851.550893] env[60024]: DEBUG nova.virt.hardware [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 851.551069] env[60024]: DEBUG nova.virt.hardware [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 851.551256] env[60024]: DEBUG nova.virt.hardware [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 851.552182] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2de667cd-a562-49ad-ab27-e1568666bdf3 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 851.562572] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e62a165-38bf-48e5-9510-f95b9b536bd6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 852.004644] env[60024]: DEBUG nova.network.neutron [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Successfully created port: 4f1d6c09-ea11-4153-abe8-bb758f2f52cc {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 853.509456] env[60024]: DEBUG nova.network.neutron [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Successfully updated port: 4f1d6c09-ea11-4153-abe8-bb758f2f52cc {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 853.520754] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Acquiring lock "refresh_cache-076c3dd5-9043-456d-af24-0d2273321085" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 853.520927] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Acquired lock "refresh_cache-076c3dd5-9043-456d-af24-0d2273321085" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 853.521097] env[60024]: DEBUG nova.network.neutron [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 
tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 853.531130] env[60024]: DEBUG nova.compute.utils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Can not refresh info_cache because instance was not found {{(pid=60024) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 853.621716] env[60024]: DEBUG nova.network.neutron [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 853.773628] env[60024]: DEBUG nova.compute.manager [req-084afeb2-8ebe-477e-9699-96fb4119cf7e req-bd50b3a8-c2ca-476f-975a-4065b701878f service nova] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Received event network-vif-deleted-cae1a3f6-5e7d-4356-ad0d-b269f2feb62a {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 853.773845] env[60024]: DEBUG nova.compute.manager [req-084afeb2-8ebe-477e-9699-96fb4119cf7e req-bd50b3a8-c2ca-476f-975a-4065b701878f service nova] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Received event network-vif-plugged-4f1d6c09-ea11-4153-abe8-bb758f2f52cc {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 853.774044] env[60024]: DEBUG oslo_concurrency.lockutils [req-084afeb2-8ebe-477e-9699-96fb4119cf7e req-bd50b3a8-c2ca-476f-975a-4065b701878f service nova] Acquiring lock "076c3dd5-9043-456d-af24-0d2273321085-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 853.774256] env[60024]: DEBUG oslo_concurrency.lockutils [req-084afeb2-8ebe-477e-9699-96fb4119cf7e req-bd50b3a8-c2ca-476f-975a-4065b701878f service nova] Lock "076c3dd5-9043-456d-af24-0d2273321085-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 853.774419] env[60024]: DEBUG oslo_concurrency.lockutils [req-084afeb2-8ebe-477e-9699-96fb4119cf7e req-bd50b3a8-c2ca-476f-975a-4065b701878f service nova] Lock "076c3dd5-9043-456d-af24-0d2273321085-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 853.774579] env[60024]: DEBUG nova.compute.manager [req-084afeb2-8ebe-477e-9699-96fb4119cf7e req-bd50b3a8-c2ca-476f-975a-4065b701878f service nova] [instance: 076c3dd5-9043-456d-af24-0d2273321085] No waiting events found dispatching network-vif-plugged-4f1d6c09-ea11-4153-abe8-bb758f2f52cc {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 853.774744] env[60024]: WARNING nova.compute.manager [req-084afeb2-8ebe-477e-9699-96fb4119cf7e req-bd50b3a8-c2ca-476f-975a-4065b701878f service nova] [instance: 
076c3dd5-9043-456d-af24-0d2273321085] Received unexpected event network-vif-plugged-4f1d6c09-ea11-4153-abe8-bb758f2f52cc for instance with vm_state deleted and task_state None. [ 854.101687] env[60024]: DEBUG nova.network.neutron [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Updating instance_info_cache with network_info: [{"id": "4f1d6c09-ea11-4153-abe8-bb758f2f52cc", "address": "fa:16:3e:0e:3e:6a", "network": {"id": "6624b4bc-d8be-4433-8685-128f1c7ab851", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-221393611-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "af470ef5e36145e3bb547ba685209d97", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4225eb1f-0af4-4ed4-8e3d-de822eb6d4ea", "external-id": "nsx-vlan-transportzone-40", "segmentation_id": 40, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4f1d6c09-ea", "ovs_interfaceid": "4f1d6c09-ea11-4153-abe8-bb758f2f52cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 854.116157] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Releasing lock "refresh_cache-076c3dd5-9043-456d-af24-0d2273321085" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 854.116522] env[60024]: DEBUG nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Instance network_info: |[{"id": "4f1d6c09-ea11-4153-abe8-bb758f2f52cc", "address": "fa:16:3e:0e:3e:6a", "network": {"id": "6624b4bc-d8be-4433-8685-128f1c7ab851", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-221393611-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "af470ef5e36145e3bb547ba685209d97", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4225eb1f-0af4-4ed4-8e3d-de822eb6d4ea", "external-id": "nsx-vlan-transportzone-40", "segmentation_id": 40, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4f1d6c09-ea", "ovs_interfaceid": "4f1d6c09-ea11-4153-abe8-bb758f2f52cc", "qbh_params": null, "qbg_params": null, 
"active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 854.117208] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0e:3e:6a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4225eb1f-0af4-4ed4-8e3d-de822eb6d4ea', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4f1d6c09-ea11-4153-abe8-bb758f2f52cc', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 854.126302] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Creating folder: Project (af470ef5e36145e3bb547ba685209d97). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 854.126868] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2ebb8663-9350-4672-b9bd-446a9110bde1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 854.139906] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Created folder: Project (af470ef5e36145e3bb547ba685209d97) in parent group-v894073. [ 854.140137] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Creating folder: Instances. Parent ref: group-v894119. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 854.140371] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-56a80875-1f77-45c6-8e17-5ad83bb75f3e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 854.153158] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Created folder: Instances in parent group-v894119. [ 854.153638] env[60024]: DEBUG oslo.service.loopingcall [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 854.153819] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 854.153896] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-aea3dfa9-4d95-4071-9979-ee943ef2e730 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 854.177503] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 854.177503] env[60024]: value = "task-4576272" [ 854.177503] env[60024]: _type = "Task" [ 854.177503] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 854.186950] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576272, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 854.688917] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576272, 'name': CreateVM_Task, 'duration_secs': 0.344496} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 854.697020] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 854.697020] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 854.697020] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 854.697020] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 854.697020] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f5812777-50b4-433d-ac6b-5feccb38f129 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 854.700373] env[60024]: DEBUG oslo_vmware.api [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Waiting for the task: (returnval){ [ 854.700373] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]525c06fb-3873-db05-3d63-7093ddfa1cbf" [ 854.700373] env[60024]: _type = "Task" [ 854.700373] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 854.712187] env[60024]: DEBUG oslo_vmware.api [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]525c06fb-3873-db05-3d63-7093ddfa1cbf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 855.218107] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 855.219304] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 855.219304] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 855.811757] env[60024]: DEBUG nova.compute.manager [req-ef46b1ff-6519-42e2-90a3-69e1a70fab42 req-cb707ee7-789a-4fae-a696-6477f5430208 service nova] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Received event network-changed-4f1d6c09-ea11-4153-abe8-bb758f2f52cc {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 855.812085] env[60024]: DEBUG nova.compute.manager [req-ef46b1ff-6519-42e2-90a3-69e1a70fab42 req-cb707ee7-789a-4fae-a696-6477f5430208 service nova] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Refreshing instance network info cache due to event network-changed-4f1d6c09-ea11-4153-abe8-bb758f2f52cc. 
{{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 855.812340] env[60024]: DEBUG oslo_concurrency.lockutils [req-ef46b1ff-6519-42e2-90a3-69e1a70fab42 req-cb707ee7-789a-4fae-a696-6477f5430208 service nova] Acquiring lock "refresh_cache-076c3dd5-9043-456d-af24-0d2273321085" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 855.812453] env[60024]: DEBUG oslo_concurrency.lockutils [req-ef46b1ff-6519-42e2-90a3-69e1a70fab42 req-cb707ee7-789a-4fae-a696-6477f5430208 service nova] Acquired lock "refresh_cache-076c3dd5-9043-456d-af24-0d2273321085" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 855.813042] env[60024]: DEBUG nova.network.neutron [req-ef46b1ff-6519-42e2-90a3-69e1a70fab42 req-cb707ee7-789a-4fae-a696-6477f5430208 service nova] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Refreshing network info cache for port 4f1d6c09-ea11-4153-abe8-bb758f2f52cc {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 855.849101] env[60024]: DEBUG nova.network.neutron [req-ef46b1ff-6519-42e2-90a3-69e1a70fab42 req-cb707ee7-789a-4fae-a696-6477f5430208 service nova] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 856.093744] env[60024]: DEBUG nova.network.neutron [req-ef46b1ff-6519-42e2-90a3-69e1a70fab42 req-cb707ee7-789a-4fae-a696-6477f5430208 service nova] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Instance is deleted, no further info cache update {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:106}} [ 856.093744] env[60024]: DEBUG oslo_concurrency.lockutils [req-ef46b1ff-6519-42e2-90a3-69e1a70fab42 req-cb707ee7-789a-4fae-a696-6477f5430208 service nova] Releasing lock "refresh_cache-076c3dd5-9043-456d-af24-0d2273321085" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 856.094557] env[60024]: DEBUG nova.compute.manager [req-ef46b1ff-6519-42e2-90a3-69e1a70fab42 req-cb707ee7-789a-4fae-a696-6477f5430208 service nova] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Received event network-vif-deleted-d143b257-e041-431a-bee6-35881166ba75 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 875.337873] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 876.341561] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 878.342580] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 878.354297] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60024) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.354547] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 878.354716] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 878.354874] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60024) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 878.356258] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cbb1f78-920e-41bb-a7b6-4155d7131e02 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.368897] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb27841c-6a88-4f30-9a2a-675340745a54 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.385365] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14b0d27b-4b90-4f97-86e2-d391689c39cc {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.393574] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa3db33d-4e6d-40a9-a68f-758d546e3df5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.425919] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180673MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60024) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 878.426145] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 878.426664] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 878.484629] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 37916d26-1b5e-4991-83a2-ca5a5b00c2ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 878.484892] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance ce222a29-3611-45b3-9664-87ae2fb1b1b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 878.485094] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 9214a18f-c22d-4e24-980e-7241a2b993bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 878.485309] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 3ab1b905-cd6f-4d2b-a244-f85e56f796d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 878.485490] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 036d6de2-f69b-4714-b89e-9c4307253675 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 878.485660] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 878.499860] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance be6a4290-dbb3-4e1f-bdd4-0dc106db9435 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 878.512979] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance fcf47169-eb7a-4644-bf3f-7150c44c247f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 878.525139] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance d55ee9a1-6921-4648-ace2-f2da13c3523e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 878.525422] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 878.525611] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=100GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 878.672338] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c821c182-b62d-40b8-acb1-e025ce6af15f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.681799] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d2dd626-395d-4b3c-b1fb-d5285fc3d74a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.725969] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e159490f-17dd-4a21-8030-bcfee34078d1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.734741] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e1230f8-429b-48c4-8bc0-2c7bb2d11288 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 878.756120] env[60024]: DEBUG nova.compute.provider_tree [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 878.768449] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 878.794676] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60024) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 878.794676] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.367s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 879.793059] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running 
periodic task ComputeManager._instance_usage_audit {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 879.793356] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 880.342330] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 880.342518] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Starting heal instance info cache {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 880.342644] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Rebuilding the list of instances to heal {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 880.366845] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 880.366845] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 880.366845] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 880.366845] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 880.366845] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 880.366845] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 880.366845] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Didn't find any instances for network info cache update. 
{{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 880.366845] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 881.342427] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 881.342737] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 882.342256] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 882.342256] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60024) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 885.685450] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Acquiring lock "e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 885.685450] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Lock "e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 887.688973] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Acquiring lock "54ded864-1c3e-4a47-968f-ca597c82cb87" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 887.688973] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Lock "54ded864-1c3e-4a47-968f-ca597c82cb87" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 895.537115] env[60024]: WARNING oslo_vmware.rw_handles [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Error 
occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 895.537115] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 895.537115] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 895.537115] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 895.537115] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 895.537115] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 895.537115] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 895.537115] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 895.537115] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 895.537115] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 895.537115] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 895.537115] env[60024]: ERROR oslo_vmware.rw_handles [ 895.537863] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to vmware_temp/3283a49f-b058-4c8d-a478-a2c57910f694/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 895.539445] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 895.539714] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Copying Virtual Disk [datastore2] vmware_temp/3283a49f-b058-4c8d-a478-a2c57910f694/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/3283a49f-b058-4c8d-a478-a2c57910f694/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 895.539989] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-05ca6cad-70c6-4dd8-b8b1-a0a368fe8ead {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 895.548805] env[60024]: DEBUG oslo_vmware.api [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Waiting for the task: (returnval){ [ 895.548805] env[60024]: value = "task-4576282" [ 895.548805] env[60024]: _type = "Task" [ 895.548805] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 895.558338] env[60024]: DEBUG oslo_vmware.api [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Task: {'id': task-4576282, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 896.059681] env[60024]: DEBUG oslo_vmware.exceptions [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Fault InvalidArgument not matched. {{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 896.059855] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 896.060291] env[60024]: ERROR nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 896.060291] env[60024]: Faults: ['InvalidArgument'] [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Traceback (most recent call last): [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] yield resources [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] self.driver.spawn(context, instance, image_meta, [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] self._fetch_image_if_missing(context, vi) [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] image_cache(vi, tmp_image_ds_loc) [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 
537, in _cache_sparse_image [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] vm_util.copy_virtual_disk( [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] session._wait_for_task(vmdk_copy_task) [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] return self.wait_for_task(task_ref) [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] return evt.wait() [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] result = hub.switch() [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] return self.greenlet.switch() [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] self.f(*self.args, **self.kw) [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] raise exceptions.translate_fault(task_info.error) [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Faults: ['InvalidArgument'] [ 896.060291] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] [ 896.061751] env[60024]: INFO nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Terminating instance [ 896.062337] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 896.062494] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 896.062610] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6fb5ff3f-8ed5-465b-b564-4e6e0e3dc28a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.065015] env[60024]: DEBUG nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 896.065210] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 896.065929] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4eaba85-6148-430a-a1a5-006bb23ae1da {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.073522] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 896.073753] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-72c6ae9d-4de6-44d6-b2fc-96a063a2baeb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.076157] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 896.076333] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 896.077293] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-78e49455-b046-4fd9-88a6-77acec93bb87 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.082655] env[60024]: DEBUG oslo_vmware.api [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Waiting for the task: (returnval){ [ 896.082655] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52e44750-8800-9881-1e0b-f9630fba5362" [ 896.082655] env[60024]: _type = "Task" [ 896.082655] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 896.090577] env[60024]: DEBUG oslo_vmware.api [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52e44750-8800-9881-1e0b-f9630fba5362, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 896.143073] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 896.143073] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 896.143385] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Deleting the datastore file [datastore2] ce222a29-3611-45b3-9664-87ae2fb1b1b8 {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 896.143670] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-770a7540-a86d-4d87-b2b4-778dfd913904 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.150332] env[60024]: DEBUG oslo_vmware.api [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Waiting for the task: (returnval){ [ 896.150332] env[60024]: value = "task-4576284" [ 896.150332] env[60024]: _type = "Task" [ 896.150332] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 896.159201] env[60024]: DEBUG oslo_vmware.api [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Task: {'id': task-4576284, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 896.594247] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 896.594642] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Creating directory with path [datastore2] vmware_temp/1ce68d87-dd07-41fd-a9dd-f1eefec73bd8/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 896.594737] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bcc80b4a-da1b-4a15-a61e-4624c4233a8c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.607569] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Created directory with path [datastore2] vmware_temp/1ce68d87-dd07-41fd-a9dd-f1eefec73bd8/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 896.607778] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Fetch image to [datastore2] vmware_temp/1ce68d87-dd07-41fd-a9dd-f1eefec73bd8/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 896.607952] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/1ce68d87-dd07-41fd-a9dd-f1eefec73bd8/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 896.608724] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa0e810f-7853-4e23-900e-52928d0401c6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.615991] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7003fc5f-49e0-4ec8-b026-c2fa4fef82c2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.625784] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c464272f-73ab-47c6-bb37-8178aedc8d2e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.660908] env[60024]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d2e3a52-82cb-4872-a65f-e1cef8120fe5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.669028] env[60024]: DEBUG oslo_vmware.api [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Task: {'id': task-4576284, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069699} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 896.670556] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 896.670755] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 896.670924] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 896.671148] env[60024]: INFO nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Took 0.61 seconds to destroy the instance on the hypervisor. 
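
The destroy/delete sequence above follows the standard oslo.vmware pattern: invoke a vCenter *_Task method, then poll the returned task until it finishes (the "Waiting for the task", "progress is 0%", and "completed successfully" lines). Below is a minimal sketch of that pattern — not Nova's actual code — with placeholder host and credentials, assuming only the public oslo_vmware.api session interface:

from oslo_vmware import api

# Placeholder connection details; the real values come from nova.conf's [vmware] section.
session = api.VMwareAPISession(
    'vcenter.example.org',
    'administrator@vsphere.local',
    'secret',
    api_retry_count=10,
    task_poll_interval=0.5)

# FileManager.DeleteDatastoreFile_Task returns a Task reference; Nova's
# ds_util.file_delete also passes a datacenter reference, omitted in this sketch.
file_manager = session.vim.service_content.fileManager
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore2] ce222a29-3611-45b3-9664-87ae2fb1b1b8')

# wait_for_task() polls the task (the "progress is 0%" lines) until it reaches
# 'success', or raises a translated VimFaultException on 'error' -- as happens
# later in this log with the InvalidArgument/fileType fault.
task_info = session.wait_for_task(task)
print(task_info.state)
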
[ 896.672969] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0dee7400-64b6-46c3-9eef-5826dc2cf5e6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.675134] env[60024]: DEBUG nova.compute.claims [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 896.675362] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 896.675694] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 896.700327] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 896.861729] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f9f5c7e-4232-4da3-8114-79c2e09843e9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.863645] env[60024]: DEBUG oslo_vmware.rw_handles [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1ce68d87-dd07-41fd-a9dd-f1eefec73bd8/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 896.920415] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d2ada01-d5ad-42d1-8d1f-6eaf484d4c8f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.926043] env[60024]: DEBUG oslo_vmware.rw_handles [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Completed reading data from the image iterator. 
{{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 896.926185] env[60024]: DEBUG oslo_vmware.rw_handles [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1ce68d87-dd07-41fd-a9dd-f1eefec73bd8/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 896.954453] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-699c4e7c-66bf-45f3-873d-a7daa21413fc {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.962859] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d291bfc-7cb2-472a-be29-1bb11e110dee {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 896.976855] env[60024]: DEBUG nova.compute.provider_tree [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 896.985762] env[60024]: DEBUG nova.scheduler.client.report [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 897.001677] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.326s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 897.002281] env[60024]: ERROR nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 897.002281] env[60024]: Faults: ['InvalidArgument'] [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Traceback (most recent call last): [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: 
ce222a29-3611-45b3-9664-87ae2fb1b1b8] self.driver.spawn(context, instance, image_meta, [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] self._fetch_image_if_missing(context, vi) [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] image_cache(vi, tmp_image_ds_loc) [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] vm_util.copy_virtual_disk( [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] session._wait_for_task(vmdk_copy_task) [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] return self.wait_for_task(task_ref) [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] return evt.wait() [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] result = hub.switch() [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] return self.greenlet.switch() [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] self.f(*self.args, **self.kw) [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 897.002281] env[60024]: ERROR 
nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] raise exceptions.translate_fault(task_info.error) [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Faults: ['InvalidArgument'] [ 897.002281] env[60024]: ERROR nova.compute.manager [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] [ 897.003271] env[60024]: DEBUG nova.compute.utils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] VimFaultException {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 897.005665] env[60024]: DEBUG nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Build of instance ce222a29-3611-45b3-9664-87ae2fb1b1b8 was re-scheduled: A specified parameter was not correct: fileType [ 897.005665] env[60024]: Faults: ['InvalidArgument'] {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 897.006051] env[60024]: DEBUG nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 897.006234] env[60024]: DEBUG nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 897.006404] env[60024]: DEBUG nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 897.006564] env[60024]: DEBUG nova.network.neutron [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 897.477441] env[60024]: DEBUG nova.network.neutron [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 897.490183] env[60024]: INFO nova.compute.manager [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: ce222a29-3611-45b3-9664-87ae2fb1b1b8] Took 0.48 seconds to deallocate network for instance. [ 897.581188] env[60024]: INFO nova.scheduler.client.report [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Deleted allocations for instance ce222a29-3611-45b3-9664-87ae2fb1b1b8 [ 897.600255] env[60024]: DEBUG oslo_concurrency.lockutils [None req-3989da87-6fee-44a7-bc2b-45950fa18c6f tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "ce222a29-3611-45b3-9664-87ae2fb1b1b8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 321.370s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 897.617989] env[60024]: DEBUG nova.compute.manager [None req-5269542d-1bd0-4a60-91e5-e702adcd1908 tempest-AttachInterfacesTestJSON-62544092 tempest-AttachInterfacesTestJSON-62544092-project-member] [instance: 1eaf8e02-bfb0-4928-9687-cc781a84d16d] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 897.646363] env[60024]: DEBUG nova.compute.manager [None req-5269542d-1bd0-4a60-91e5-e702adcd1908 tempest-AttachInterfacesTestJSON-62544092 tempest-AttachInterfacesTestJSON-62544092-project-member] [instance: 1eaf8e02-bfb0-4928-9687-cc781a84d16d] Instance disappeared before build. 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 897.667864] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5269542d-1bd0-4a60-91e5-e702adcd1908 tempest-AttachInterfacesTestJSON-62544092 tempest-AttachInterfacesTestJSON-62544092-project-member] Lock "1eaf8e02-bfb0-4928-9687-cc781a84d16d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.126s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 897.677543] env[60024]: DEBUG nova.compute.manager [None req-14455799-885d-47a6-b31a-d48adadc4279 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: c841298b-f103-4dc7-8884-efdf2ebc20a6] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 897.721024] env[60024]: DEBUG nova.compute.manager [None req-14455799-885d-47a6-b31a-d48adadc4279 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] [instance: c841298b-f103-4dc7-8884-efdf2ebc20a6] Instance disappeared before build. {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 897.742805] env[60024]: DEBUG oslo_concurrency.lockutils [None req-14455799-885d-47a6-b31a-d48adadc4279 tempest-ImagesTestJSON-1812607055 tempest-ImagesTestJSON-1812607055-project-member] Lock "c841298b-f103-4dc7-8884-efdf2ebc20a6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 233.835s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 897.751555] env[60024]: DEBUG nova.compute.manager [None req-2a575c2b-5102-4501-a724-cecb4fb9a882 tempest-ServersTestJSON-1695110591 tempest-ServersTestJSON-1695110591-project-member] [instance: bf70d23b-4ab5-476e-814c-264b6a9f2455] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 897.777568] env[60024]: DEBUG nova.compute.manager [None req-2a575c2b-5102-4501-a724-cecb4fb9a882 tempest-ServersTestJSON-1695110591 tempest-ServersTestJSON-1695110591-project-member] [instance: bf70d23b-4ab5-476e-814c-264b6a9f2455] Instance disappeared before build. {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 897.800389] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2a575c2b-5102-4501-a724-cecb4fb9a882 tempest-ServersTestJSON-1695110591 tempest-ServersTestJSON-1695110591-project-member] Lock "bf70d23b-4ab5-476e-814c-264b6a9f2455" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 231.289s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 897.810121] env[60024]: DEBUG nova.compute.manager [None req-b87ef477-9e92-4961-845a-6acd6bee3a06 tempest-ServersNegativeTestJSON-1972359588 tempest-ServersNegativeTestJSON-1972359588-project-member] [instance: 67ff7d52-6e30-4730-9b5a-9ae32f68b953] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 897.835941] env[60024]: DEBUG nova.compute.manager [None req-b87ef477-9e92-4961-845a-6acd6bee3a06 tempest-ServersNegativeTestJSON-1972359588 tempest-ServersNegativeTestJSON-1972359588-project-member] [instance: 67ff7d52-6e30-4730-9b5a-9ae32f68b953] Instance disappeared before build. 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 897.856303] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b87ef477-9e92-4961-845a-6acd6bee3a06 tempest-ServersNegativeTestJSON-1972359588 tempest-ServersNegativeTestJSON-1972359588-project-member] Lock "67ff7d52-6e30-4730-9b5a-9ae32f68b953" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 228.440s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 897.865528] env[60024]: DEBUG nova.compute.manager [None req-ba27ce61-be1b-476b-9184-384249a73295 tempest-SecurityGroupsTestJSON-2088684547 tempest-SecurityGroupsTestJSON-2088684547-project-member] [instance: bd91e947-acae-4dbd-b48b-5a6727eb4cbb] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 897.888568] env[60024]: DEBUG nova.compute.manager [None req-ba27ce61-be1b-476b-9184-384249a73295 tempest-SecurityGroupsTestJSON-2088684547 tempest-SecurityGroupsTestJSON-2088684547-project-member] [instance: bd91e947-acae-4dbd-b48b-5a6727eb4cbb] Instance disappeared before build. {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 897.908904] env[60024]: DEBUG oslo_concurrency.lockutils [None req-ba27ce61-be1b-476b-9184-384249a73295 tempest-SecurityGroupsTestJSON-2088684547 tempest-SecurityGroupsTestJSON-2088684547-project-member] Lock "bd91e947-acae-4dbd-b48b-5a6727eb4cbb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 228.277s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 897.918375] env[60024]: DEBUG nova.compute.manager [None req-798944be-b318-4a6d-a582-3984731e10fc tempest-ServerActionsTestOtherA-202760684 tempest-ServerActionsTestOtherA-202760684-project-member] [instance: 95358801-c9d8-4582-a712-36a8bf586456] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 897.942056] env[60024]: DEBUG nova.compute.manager [None req-798944be-b318-4a6d-a582-3984731e10fc tempest-ServerActionsTestOtherA-202760684 tempest-ServerActionsTestOtherA-202760684-project-member] [instance: 95358801-c9d8-4582-a712-36a8bf586456] Instance disappeared before build. {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 897.964084] env[60024]: DEBUG oslo_concurrency.lockutils [None req-798944be-b318-4a6d-a582-3984731e10fc tempest-ServerActionsTestOtherA-202760684 tempest-ServerActionsTestOtherA-202760684-project-member] Lock "95358801-c9d8-4582-a712-36a8bf586456" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 220.523s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 897.973805] env[60024]: DEBUG nova.compute.manager [None req-814fad23-a147-41ec-ba07-f0ec4ac3c42f tempest-ServerActionsV293TestJSON-332701173 tempest-ServerActionsV293TestJSON-332701173-project-member] [instance: be6a4290-dbb3-4e1f-bdd4-0dc106db9435] Starting instance... 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 897.997427] env[60024]: DEBUG nova.compute.manager [None req-814fad23-a147-41ec-ba07-f0ec4ac3c42f tempest-ServerActionsV293TestJSON-332701173 tempest-ServerActionsV293TestJSON-332701173-project-member] [instance: be6a4290-dbb3-4e1f-bdd4-0dc106db9435] Instance disappeared before build. {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 898.018342] env[60024]: DEBUG oslo_concurrency.lockutils [None req-814fad23-a147-41ec-ba07-f0ec4ac3c42f tempest-ServerActionsV293TestJSON-332701173 tempest-ServerActionsV293TestJSON-332701173-project-member] Lock "be6a4290-dbb3-4e1f-bdd4-0dc106db9435" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 203.531s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.027324] env[60024]: DEBUG nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 898.075331] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 898.075608] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 898.077141] env[60024]: INFO nova.compute.claims [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 898.272030] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6de0911d-3497-4ea1-aa91-d1a19005fdfa {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.280768] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c5842c9-9729-4681-af78-8f3872d1b19e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.313352] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fdc2237-dbdc-4acd-80c2-4b06a26a831c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.321866] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2392d171-f982-4d7c-ad4f-102043f5ba86 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.337298] env[60024]: DEBUG nova.compute.provider_tree 
[None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 898.346886] env[60024]: DEBUG nova.scheduler.client.report [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 898.366526] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 898.367038] env[60024]: DEBUG nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 898.401023] env[60024]: DEBUG nova.compute.utils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 898.402517] env[60024]: DEBUG nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 898.402769] env[60024]: DEBUG nova.network.neutron [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 898.415058] env[60024]: DEBUG nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Start building block device mappings for instance. 
{{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 898.479598] env[60024]: DEBUG nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Start spawning the instance on the hypervisor. {{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 898.501554] env[60024]: DEBUG nova.virt.hardware [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 898.501746] env[60024]: DEBUG nova.virt.hardware [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 898.501935] env[60024]: DEBUG nova.virt.hardware [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 898.502102] env[60024]: DEBUG nova.virt.hardware [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 898.502249] env[60024]: DEBUG nova.virt.hardware [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 898.502395] env[60024]: DEBUG nova.virt.hardware [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 898.502598] env[60024]: DEBUG nova.virt.hardware [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:569}} [ 898.502758] env[60024]: DEBUG nova.virt.hardware [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 898.502921] env[60024]: DEBUG nova.virt.hardware [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 898.503301] env[60024]: DEBUG nova.virt.hardware [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 898.503516] env[60024]: DEBUG nova.virt.hardware [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 898.504674] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-188675f1-f717-4e90-ad1b-5c8e39ef50eb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.513205] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94855ee8-111d-44f3-a778-88964227fcb4 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 898.518595] env[60024]: DEBUG nova.policy [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e47f8eca093a4d5a82614eebe2f3a214', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bd0fcd8949d44cad8e75a3878331c428', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 898.825094] env[60024]: DEBUG nova.network.neutron [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Successfully created port: 00cb7570-258d-4c23-bcaf-72891c8b2671 {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 899.738229] env[60024]: DEBUG nova.compute.manager [req-37254916-d12c-46c4-bcc3-c7f5ad2977e9 req-f1a5ad5f-e0e4-42fa-a35b-76c95c6578d0 service nova] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Received event network-vif-plugged-00cb7570-258d-4c23-bcaf-72891c8b2671 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 899.738559] env[60024]: DEBUG oslo_concurrency.lockutils [req-37254916-d12c-46c4-bcc3-c7f5ad2977e9 
req-f1a5ad5f-e0e4-42fa-a35b-76c95c6578d0 service nova] Acquiring lock "fcf47169-eb7a-4644-bf3f-7150c44c247f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.738690] env[60024]: DEBUG oslo_concurrency.lockutils [req-37254916-d12c-46c4-bcc3-c7f5ad2977e9 req-f1a5ad5f-e0e4-42fa-a35b-76c95c6578d0 service nova] Lock "fcf47169-eb7a-4644-bf3f-7150c44c247f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.738860] env[60024]: DEBUG oslo_concurrency.lockutils [req-37254916-d12c-46c4-bcc3-c7f5ad2977e9 req-f1a5ad5f-e0e4-42fa-a35b-76c95c6578d0 service nova] Lock "fcf47169-eb7a-4644-bf3f-7150c44c247f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 899.739175] env[60024]: DEBUG nova.compute.manager [req-37254916-d12c-46c4-bcc3-c7f5ad2977e9 req-f1a5ad5f-e0e4-42fa-a35b-76c95c6578d0 service nova] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] No waiting events found dispatching network-vif-plugged-00cb7570-258d-4c23-bcaf-72891c8b2671 {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 899.739398] env[60024]: WARNING nova.compute.manager [req-37254916-d12c-46c4-bcc3-c7f5ad2977e9 req-f1a5ad5f-e0e4-42fa-a35b-76c95c6578d0 service nova] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Received unexpected event network-vif-plugged-00cb7570-258d-4c23-bcaf-72891c8b2671 for instance with vm_state building and task_state spawning. 
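
The network-vif-plugged handling above shows Nova's external-event dispatch: Neutron reports the plugged port, the per-instance "-events" lock is taken, and because the spawning thread has not yet registered a waiter the event is logged as unexpected while the instance is still building. The following is a simplified, self-contained illustration of that register-then-wait pattern using eventlet; the names waiters, pop_instance_event and spawn_and_wait_for_vif are hypothetical and are not Nova's:

import eventlet
from eventlet import event

# Hypothetical event registry; in Nova this role is played by InstanceEvents
# plus the per-instance "<uuid>-events" lock visible in the log above.
waiters = {}

def pop_instance_event(name):
    # Dispatch an external event; warn when nobody has registered a waiter.
    evt = waiters.pop(name, None)
    if evt is None:
        print('WARNING: received unexpected event %s' % name)
        return
    evt.send('completed')

def spawn_and_wait_for_vif(name):
    waiters[name] = event.Event()     # register *before* plugging the VIF
    # ... build the VM / plug the VIF here ...
    print('got %s -> %s' % (name, waiters[name].wait()))

# Event arrives before anyone waits -> the WARNING path seen above:
pop_instance_event('network-vif-plugged-00cb7570')

# Waiter registered first, then the event arrives -> normal completion:
gt = eventlet.spawn(spawn_and_wait_for_vif, 'network-vif-plugged-00cb7570')
eventlet.sleep(0)                     # let the spawner register its waiter
pop_instance_event('network-vif-plugged-00cb7570')
gt.wait()
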
[ 899.788284] env[60024]: DEBUG nova.network.neutron [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Successfully updated port: 00cb7570-258d-4c23-bcaf-72891c8b2671 {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 899.797094] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Acquiring lock "refresh_cache-fcf47169-eb7a-4644-bf3f-7150c44c247f" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 899.797280] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Acquired lock "refresh_cache-fcf47169-eb7a-4644-bf3f-7150c44c247f" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 899.797446] env[60024]: DEBUG nova.network.neutron [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 899.837343] env[60024]: DEBUG nova.network.neutron [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Instance cache missing network info. 
{{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 900.012776] env[60024]: DEBUG nova.network.neutron [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Updating instance_info_cache with network_info: [{"id": "00cb7570-258d-4c23-bcaf-72891c8b2671", "address": "fa:16:3e:9b:09:fc", "network": {"id": "1969bbb1-a8df-47e6-93ff-f836a3cdec1f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-285142802-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd0fcd8949d44cad8e75a3878331c428", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d88bb07-f93c-45ca-bce7-230cb1f33833", "external-id": "nsx-vlan-transportzone-387", "segmentation_id": 387, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap00cb7570-25", "ovs_interfaceid": "00cb7570-258d-4c23-bcaf-72891c8b2671", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 900.025249] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Releasing lock "refresh_cache-fcf47169-eb7a-4644-bf3f-7150c44c247f" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 900.025558] env[60024]: DEBUG nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Instance network_info: |[{"id": "00cb7570-258d-4c23-bcaf-72891c8b2671", "address": "fa:16:3e:9b:09:fc", "network": {"id": "1969bbb1-a8df-47e6-93ff-f836a3cdec1f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-285142802-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd0fcd8949d44cad8e75a3878331c428", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d88bb07-f93c-45ca-bce7-230cb1f33833", "external-id": "nsx-vlan-transportzone-387", "segmentation_id": 387, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap00cb7570-25", "ovs_interfaceid": "00cb7570-258d-4c23-bcaf-72891c8b2671", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 900.025931] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9b:09:fc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2d88bb07-f93c-45ca-bce7-230cb1f33833', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '00cb7570-258d-4c23-bcaf-72891c8b2671', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 900.033857] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Creating folder: Project (bd0fcd8949d44cad8e75a3878331c428). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 900.034521] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ae619fd4-b2ea-434a-a166-76f7fe3f7b49 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.047097] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Created folder: Project (bd0fcd8949d44cad8e75a3878331c428) in parent group-v894073. [ 900.047298] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Creating folder: Instances. Parent ref: group-v894126. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 900.047532] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fe0817a7-4357-4d41-a879-26d3fce1f595 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.057646] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Created folder: Instances in parent group-v894126. [ 900.057873] env[60024]: DEBUG oslo.service.loopingcall [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 900.058068] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 900.058268] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4d2b7be2-7080-47fa-9310-835815fde397 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.079035] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 900.079035] env[60024]: value = "task-4576288" [ 900.079035] env[60024]: _type = "Task" [ 900.079035] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 900.087948] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576288, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 900.589627] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576288, 'name': CreateVM_Task, 'duration_secs': 0.309741} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 900.589822] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 900.590489] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 900.590654] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 900.591032] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 900.591290] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1c9c9984-4503-4c60-be10-0f4241283943 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 900.596526] env[60024]: DEBUG oslo_vmware.api [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Waiting for the task: (returnval){ [ 900.596526] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]528cc431-6363-a220-6484-0d0f7888e30d" [ 900.596526] env[60024]: _type = "Task" [ 900.596526] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 900.606694] env[60024]: DEBUG oslo_vmware.api [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]528cc431-6363-a220-6484-0d0f7888e30d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 900.959852] env[60024]: DEBUG oslo_concurrency.lockutils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Acquiring lock "e259637a-0fc8-4368-8a7a-c15a134ed17d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 900.960254] env[60024]: DEBUG oslo_concurrency.lockutils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Lock "e259637a-0fc8-4368-8a7a-c15a134ed17d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 901.108760] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 901.109287] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 901.109637] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 901.761485] env[60024]: DEBUG nova.compute.manager [req-5b78a598-0bae-4c51-b60f-eee5640c4740 req-974aad81-43af-46a5-8c14-870923f74eec service nova] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Received event network-changed-00cb7570-258d-4c23-bcaf-72891c8b2671 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 901.761648] env[60024]: DEBUG nova.compute.manager [req-5b78a598-0bae-4c51-b60f-eee5640c4740 req-974aad81-43af-46a5-8c14-870923f74eec service nova] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Refreshing instance network info cache due to event network-changed-00cb7570-258d-4c23-bcaf-72891c8b2671. 
{{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 901.761847] env[60024]: DEBUG oslo_concurrency.lockutils [req-5b78a598-0bae-4c51-b60f-eee5640c4740 req-974aad81-43af-46a5-8c14-870923f74eec service nova] Acquiring lock "refresh_cache-fcf47169-eb7a-4644-bf3f-7150c44c247f" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 901.761996] env[60024]: DEBUG oslo_concurrency.lockutils [req-5b78a598-0bae-4c51-b60f-eee5640c4740 req-974aad81-43af-46a5-8c14-870923f74eec service nova] Acquired lock "refresh_cache-fcf47169-eb7a-4644-bf3f-7150c44c247f" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 901.762359] env[60024]: DEBUG nova.network.neutron [req-5b78a598-0bae-4c51-b60f-eee5640c4740 req-974aad81-43af-46a5-8c14-870923f74eec service nova] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Refreshing network info cache for port 00cb7570-258d-4c23-bcaf-72891c8b2671 {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 902.069885] env[60024]: DEBUG nova.network.neutron [req-5b78a598-0bae-4c51-b60f-eee5640c4740 req-974aad81-43af-46a5-8c14-870923f74eec service nova] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Updated VIF entry in instance network info cache for port 00cb7570-258d-4c23-bcaf-72891c8b2671. {{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 902.070293] env[60024]: DEBUG nova.network.neutron [req-5b78a598-0bae-4c51-b60f-eee5640c4740 req-974aad81-43af-46a5-8c14-870923f74eec service nova] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Updating instance_info_cache with network_info: [{"id": "00cb7570-258d-4c23-bcaf-72891c8b2671", "address": "fa:16:3e:9b:09:fc", "network": {"id": "1969bbb1-a8df-47e6-93ff-f836a3cdec1f", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-285142802-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bd0fcd8949d44cad8e75a3878331c428", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d88bb07-f93c-45ca-bce7-230cb1f33833", "external-id": "nsx-vlan-transportzone-387", "segmentation_id": 387, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap00cb7570-25", "ovs_interfaceid": "00cb7570-258d-4c23-bcaf-72891c8b2671", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 902.080225] env[60024]: DEBUG oslo_concurrency.lockutils [req-5b78a598-0bae-4c51-b60f-eee5640c4740 req-974aad81-43af-46a5-8c14-870923f74eec service nova] Releasing lock "refresh_cache-fcf47169-eb7a-4644-bf3f-7150c44c247f" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 937.343612] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60024) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 939.342469] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 939.353500] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 939.353736] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 939.353907] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 939.354079] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60024) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 939.355371] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1338821d-bfdd-4704-b367-96ecbe83fb4a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 939.364667] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b149aa31-97a5-43fc-9845-87035db40fdb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 939.379017] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08e2880f-6143-44cd-84c6-80aaebfa8866 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 939.386039] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38883093-709c-4a32-aada-36ac9930ed5c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 939.417087] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180658MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60024) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 939.417257] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 939.417453] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] 
Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 939.471975] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 37916d26-1b5e-4991-83a2-ca5a5b00c2ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 939.472503] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 9214a18f-c22d-4e24-980e-7241a2b993bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 939.472503] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 3ab1b905-cd6f-4d2b-a244-f85e56f796d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 939.472503] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 036d6de2-f69b-4714-b89e-9c4307253675 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 939.472503] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 939.472690] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance fcf47169-eb7a-4644-bf3f-7150c44c247f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 939.484847] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance d55ee9a1-6921-4648-ace2-f2da13c3523e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 939.496775] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 939.508170] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 54ded864-1c3e-4a47-968f-ca597c82cb87 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 939.519304] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance e259637a-0fc8-4368-8a7a-c15a134ed17d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 939.519542] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 939.519688] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=100GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 939.655772] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06971a50-af59-4f0a-b67e-98d46256acf1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 939.663788] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ac3ae44-b8b7-4fff-8406-2ee5534640ef {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 939.694226] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9be28759-766a-4261-abee-65a7ea45cdd3 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 939.702758] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd12967a-0fd9-4b78-92ed-dfec3c32ac39 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 939.718197] env[60024]: DEBUG nova.compute.provider_tree [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 939.726898] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 939.740185] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60024) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 939.740385] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.323s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 940.740052] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 940.740052] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 941.341757] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 941.341996] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 942.337084] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 942.340596] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 942.340760] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Starting heal instance info cache {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 942.340872] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Rebuilding the list of instances to heal {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 942.357256] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Skipping network cache update for instance because it is Building. 
{{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 942.357413] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 942.357539] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 942.357829] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 942.357988] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 942.358130] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 942.358250] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Didn't find any instances for network info cache update. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 942.358690] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 942.358833] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=60024) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 946.498076] env[60024]: WARNING oslo_vmware.rw_handles [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 946.498076] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 946.498076] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 946.498076] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 946.498076] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 946.498076] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 946.498076] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 946.498076] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 946.498076] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 946.498076] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 946.498076] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 946.498076] env[60024]: ERROR oslo_vmware.rw_handles [ 946.499164] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to vmware_temp/1ce68d87-dd07-41fd-a9dd-f1eefec73bd8/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 946.500242] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 946.500519] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Copying Virtual Disk [datastore2] vmware_temp/1ce68d87-dd07-41fd-a9dd-f1eefec73bd8/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/1ce68d87-dd07-41fd-a9dd-f1eefec73bd8/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 946.500797] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-81c418af-9862-4795-9636-3bfbbbe9150d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.509740] env[60024]: DEBUG oslo_vmware.api [None req-6a0399c7-51de-4d53-b231-40ed4691f878 
tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Waiting for the task: (returnval){ [ 946.509740] env[60024]: value = "task-4576289" [ 946.509740] env[60024]: _type = "Task" [ 946.509740] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 946.518427] env[60024]: DEBUG oslo_vmware.api [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Task: {'id': task-4576289, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 947.024144] env[60024]: DEBUG oslo_vmware.exceptions [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Fault InvalidArgument not matched. {{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 947.024144] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 947.024833] env[60024]: ERROR nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 947.024833] env[60024]: Faults: ['InvalidArgument'] [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Traceback (most recent call last): [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] yield resources [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] self.driver.spawn(context, instance, image_meta, [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] self._fetch_image_if_missing(context, vi) [ 947.024833] env[60024]: ERROR 
nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] image_cache(vi, tmp_image_ds_loc) [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] vm_util.copy_virtual_disk( [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] session._wait_for_task(vmdk_copy_task) [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] return self.wait_for_task(task_ref) [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] return evt.wait() [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] result = hub.switch() [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] return self.greenlet.switch() [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] self.f(*self.args, **self.kw) [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] raise exceptions.translate_fault(task_info.error) [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Faults: ['InvalidArgument'] [ 947.024833] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] [ 947.024833] env[60024]: INFO nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] 
[instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Terminating instance [ 947.026426] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 947.026637] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 947.026878] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1c1f01f4-6c15-4105-8a26-0ebf73ef1dfe {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.029313] env[60024]: DEBUG nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 947.029513] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 947.030290] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79007fe8-193d-4989-88c9-324bb86b63d3 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.037881] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 947.038172] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e022825d-ace4-49ae-a76f-a59a6a259578 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.040524] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 947.040705] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 947.041664] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1687cf4c-10a2-4cdd-adfd-697533b6bded {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.047849] env[60024]: DEBUG oslo_vmware.api [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Waiting for the task: (returnval){ [ 947.047849] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52d66c38-70f6-40c2-6b61-b9b0351147a7" [ 947.047849] env[60024]: _type = "Task" [ 947.047849] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 947.055423] env[60024]: DEBUG oslo_vmware.api [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52d66c38-70f6-40c2-6b61-b9b0351147a7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 947.109325] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 947.109535] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 947.109650] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Deleting the datastore file [datastore2] 9214a18f-c22d-4e24-980e-7241a2b993bd {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 947.109985] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7b603e68-8b1f-4429-b10b-eb2d5b5ab271 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.117061] env[60024]: DEBUG oslo_vmware.api [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Waiting for the task: (returnval){ [ 947.117061] env[60024]: value = "task-4576291" [ 947.117061] env[60024]: _type = "Task" [ 947.117061] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 947.125363] env[60024]: DEBUG oslo_vmware.api [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Task: {'id': task-4576291, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 947.558772] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 947.559184] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Creating directory with path [datastore2] vmware_temp/093f466c-4b68-4d40-8465-c38da6c640dc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 947.559282] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-83d9f941-f838-440f-8833-3d3b79e18132 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.571457] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Created directory with path [datastore2] vmware_temp/093f466c-4b68-4d40-8465-c38da6c640dc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 947.571655] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Fetch image to [datastore2] vmware_temp/093f466c-4b68-4d40-8465-c38da6c640dc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 947.571824] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/093f466c-4b68-4d40-8465-c38da6c640dc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 947.572582] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf86934d-3067-49ce-83d4-d9e815a2406e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.579317] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b762c54-0b58-40ae-8f0d-1b2986872dc8 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.588303] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6681543c-280f-429c-b8ae-1c530e7b600e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.621824] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb76cf4b-2ccb-4037-8b96-fb2139ecc373 {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.630925] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7736e27d-5647-4320-b212-249c254e8833 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.632701] env[60024]: DEBUG oslo_vmware.api [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Task: {'id': task-4576291, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069908} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 947.632935] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 947.633134] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 947.633307] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 947.633499] env[60024]: INFO nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 947.635587] env[60024]: DEBUG nova.compute.claims [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 947.635776] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 947.636065] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 947.659461] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 947.706965] env[60024]: DEBUG oslo_vmware.rw_handles [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/093f466c-4b68-4d40-8465-c38da6c640dc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 947.764965] env[60024]: DEBUG oslo_vmware.rw_handles [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Completed reading data from the image iterator. {{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 947.765183] env[60024]: DEBUG oslo_vmware.rw_handles [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/093f466c-4b68-4d40-8465-c38da6c640dc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 947.853218] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f13a490a-ac68-4447-9342-c05ee04fa791 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.861307] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48a25f72-5050-4425-bddb-eb6039d92529 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.891848] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68f5a7ff-3f28-458c-b12b-ac65f7a0c661 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.900166] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4df22bc-dbe0-4579-864b-ae374641c2e7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 947.914921] env[60024]: DEBUG nova.compute.provider_tree [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 947.923268] env[60024]: DEBUG nova.scheduler.client.report [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 947.936276] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.300s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 947.936726] env[60024]: ERROR nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 947.936726] env[60024]: Faults: ['InvalidArgument'] [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Traceback (most recent call last): [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/compute/manager.py", line 2611, 
in _build_and_run_instance [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] self.driver.spawn(context, instance, image_meta, [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] self._fetch_image_if_missing(context, vi) [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] image_cache(vi, tmp_image_ds_loc) [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] vm_util.copy_virtual_disk( [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] session._wait_for_task(vmdk_copy_task) [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] return self.wait_for_task(task_ref) [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] return evt.wait() [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] result = hub.switch() [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] return self.greenlet.switch() [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] self.f(*self.args, **self.kw) [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] raise exceptions.translate_fault(task_info.error) [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Faults: ['InvalidArgument'] [ 947.936726] env[60024]: ERROR nova.compute.manager [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] [ 947.937697] env[60024]: DEBUG nova.compute.utils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] VimFaultException {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 947.938862] env[60024]: DEBUG nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Build of instance 9214a18f-c22d-4e24-980e-7241a2b993bd was re-scheduled: A specified parameter was not correct: fileType [ 947.938862] env[60024]: Faults: ['InvalidArgument'] {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 947.939243] env[60024]: DEBUG nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 947.939416] env[60024]: DEBUG nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 947.939585] env[60024]: DEBUG nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 947.939749] env[60024]: DEBUG nova.network.neutron [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 948.388326] env[60024]: DEBUG nova.network.neutron [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 948.399649] env[60024]: INFO nova.compute.manager [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Took 0.46 seconds to deallocate network for instance. [ 948.487727] env[60024]: INFO nova.scheduler.client.report [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Deleted allocations for instance 9214a18f-c22d-4e24-980e-7241a2b993bd [ 948.506190] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6a0399c7-51de-4d53-b231-40ed4691f878 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Lock "9214a18f-c22d-4e24-980e-7241a2b993bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 369.687s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.506910] env[60024]: DEBUG oslo_concurrency.lockutils [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Lock "9214a18f-c22d-4e24-980e-7241a2b993bd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 169.601s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.507148] env[60024]: DEBUG oslo_concurrency.lockutils [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Acquiring lock "9214a18f-c22d-4e24-980e-7241a2b993bd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.507369] env[60024]: DEBUG oslo_concurrency.lockutils [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 
tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Lock "9214a18f-c22d-4e24-980e-7241a2b993bd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.507591] env[60024]: DEBUG oslo_concurrency.lockutils [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Lock "9214a18f-c22d-4e24-980e-7241a2b993bd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.509584] env[60024]: INFO nova.compute.manager [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Terminating instance [ 948.511031] env[60024]: DEBUG nova.compute.manager [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 948.511260] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 948.511714] env[60024]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7ca42108-827e-4c13-9fda-e37fd0e692f0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.521526] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06a7493d-6905-4510-9d6c-fe4b0ae79946 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.532296] env[60024]: DEBUG nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 948.554050] env[60024]: WARNING nova.virt.vmwareapi.vmops [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9214a18f-c22d-4e24-980e-7241a2b993bd could not be found. 
[ 948.554328] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 948.554430] env[60024]: INFO nova.compute.manager [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Took 0.04 seconds to destroy the instance on the hypervisor. [ 948.554688] env[60024]: DEBUG oslo.service.loopingcall [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 948.554903] env[60024]: DEBUG nova.compute.manager [-] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 948.554998] env[60024]: DEBUG nova.network.neutron [-] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 948.577651] env[60024]: DEBUG nova.network.neutron [-] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 948.579229] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 948.579449] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 948.580867] env[60024]: INFO nova.compute.claims [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 948.584717] env[60024]: INFO nova.compute.manager [-] [instance: 9214a18f-c22d-4e24-980e-7241a2b993bd] Took 0.03 seconds to deallocate network for instance. 
[ 948.669522] env[60024]: DEBUG oslo_concurrency.lockutils [None req-d7442ef0-80a3-46a4-b809-0339523a88a1 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658 tempest-FloatingIPsAssociationNegativeTestJSON-1831931658-project-member] Lock "9214a18f-c22d-4e24-980e-7241a2b993bd" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.163s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.744871] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b3a726c-fb1d-41e2-af02-eb64482f52b1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.753461] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38715ec0-82f5-4e8b-90eb-e8b403327c12 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.783840] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e5ecff5-58bd-4724-829d-1f01169236ce {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.792206] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-708614f5-e21b-4bfd-9f7f-cd5e82b6af39 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.806011] env[60024]: DEBUG nova.compute.provider_tree [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 948.814792] env[60024]: DEBUG nova.scheduler.client.report [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 948.827220] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.248s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 948.827750] env[60024]: DEBUG nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Start building networks asynchronously for instance. 
{{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 948.860076] env[60024]: DEBUG nova.compute.utils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 948.861515] env[60024]: DEBUG nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 948.861747] env[60024]: DEBUG nova.network.neutron [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 948.871054] env[60024]: DEBUG nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 948.950371] env[60024]: DEBUG nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 948.967178] env[60024]: DEBUG nova.policy [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c06f3b2e0bd4459696b6724fa90f3809', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0031e355e57421a8d48003a7eb717db', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 948.971350] env[60024]: DEBUG nova.virt.hardware [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 948.971605] env[60024]: DEBUG nova.virt.hardware [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 948.971767] env[60024]: DEBUG nova.virt.hardware [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 948.971949] env[60024]: DEBUG nova.virt.hardware [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 948.972110] env[60024]: DEBUG nova.virt.hardware [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 948.972261] env[60024]: DEBUG nova.virt.hardware [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 948.972466] env[60024]: DEBUG 
nova.virt.hardware [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 948.972627] env[60024]: DEBUG nova.virt.hardware [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 948.972789] env[60024]: DEBUG nova.virt.hardware [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 948.972947] env[60024]: DEBUG nova.virt.hardware [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 948.973133] env[60024]: DEBUG nova.virt.hardware [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 948.974009] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d826c9e9-331a-49bf-826f-d1e2ae84cd7f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 948.983649] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdaf560e-78f9-4f0f-8ffc-cd07e25cca73 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 949.398699] env[60024]: DEBUG nova.network.neutron [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Successfully created port: da4712f4-c2f6-4e13-b668-b54f62d5a679 {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 950.076381] env[60024]: DEBUG nova.network.neutron [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Successfully updated port: da4712f4-c2f6-4e13-b668-b54f62d5a679 {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 950.096226] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "refresh_cache-d55ee9a1-6921-4648-ace2-f2da13c3523e" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 950.097497] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 
tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquired lock "refresh_cache-d55ee9a1-6921-4648-ace2-f2da13c3523e" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 950.097730] env[60024]: DEBUG nova.network.neutron [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 950.144349] env[60024]: DEBUG nova.network.neutron [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 950.696181] env[60024]: DEBUG nova.compute.manager [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Received event network-vif-plugged-da4712f4-c2f6-4e13-b668-b54f62d5a679 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 950.696419] env[60024]: DEBUG oslo_concurrency.lockutils [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] Acquiring lock "d55ee9a1-6921-4648-ace2-f2da13c3523e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 950.696628] env[60024]: DEBUG oslo_concurrency.lockutils [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] Lock "d55ee9a1-6921-4648-ace2-f2da13c3523e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 950.696798] env[60024]: DEBUG oslo_concurrency.lockutils [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] Lock "d55ee9a1-6921-4648-ace2-f2da13c3523e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 950.697062] env[60024]: DEBUG nova.compute.manager [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] No waiting events found dispatching network-vif-plugged-da4712f4-c2f6-4e13-b668-b54f62d5a679 {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 950.697133] env[60024]: WARNING nova.compute.manager [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Received unexpected event network-vif-plugged-da4712f4-c2f6-4e13-b668-b54f62d5a679 for instance with vm_state building and task_state spawning. 
[ 950.697407] env[60024]: DEBUG nova.compute.manager [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Received event network-changed-da4712f4-c2f6-4e13-b668-b54f62d5a679 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 950.697466] env[60024]: DEBUG nova.compute.manager [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Refreshing instance network info cache due to event network-changed-da4712f4-c2f6-4e13-b668-b54f62d5a679. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 950.698453] env[60024]: DEBUG oslo_concurrency.lockutils [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] Acquiring lock "refresh_cache-d55ee9a1-6921-4648-ace2-f2da13c3523e" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 950.698569] env[60024]: DEBUG nova.network.neutron [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Updating instance_info_cache with network_info: [{"id": "da4712f4-c2f6-4e13-b668-b54f62d5a679", "address": "fa:16:3e:ee:a6:8a", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.56", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapda4712f4-c2", "ovs_interfaceid": "da4712f4-c2f6-4e13-b668-b54f62d5a679", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 950.710279] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Releasing lock "refresh_cache-d55ee9a1-6921-4648-ace2-f2da13c3523e" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 950.710639] env[60024]: DEBUG nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Instance network_info: |[{"id": "da4712f4-c2f6-4e13-b668-b54f62d5a679", "address": "fa:16:3e:ee:a6:8a", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "192.168.233.56", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapda4712f4-c2", "ovs_interfaceid": "da4712f4-c2f6-4e13-b668-b54f62d5a679", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 950.710910] env[60024]: DEBUG oslo_concurrency.lockutils [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] Acquired lock "refresh_cache-d55ee9a1-6921-4648-ace2-f2da13c3523e" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 950.711105] env[60024]: DEBUG nova.network.neutron [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Refreshing network info cache for port da4712f4-c2f6-4e13-b668-b54f62d5a679 {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 950.712122] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ee:a6:8a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3d31a554-a94c-4471-892f-f65aa87b8279', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'da4712f4-c2f6-4e13-b668-b54f62d5a679', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 950.720374] env[60024]: DEBUG oslo.service.loopingcall [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 950.721609] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 950.724052] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4bdf0c72-cb01-4cc1-80d5-a55d41fb5cc9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 950.747787] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 950.747787] env[60024]: value = "task-4576292" [ 950.747787] env[60024]: _type = "Task" [ 950.747787] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 950.758546] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576292, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 951.034541] env[60024]: DEBUG nova.network.neutron [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Updated VIF entry in instance network info cache for port da4712f4-c2f6-4e13-b668-b54f62d5a679. {{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 951.034963] env[60024]: DEBUG nova.network.neutron [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Updating instance_info_cache with network_info: [{"id": "da4712f4-c2f6-4e13-b668-b54f62d5a679", "address": "fa:16:3e:ee:a6:8a", "network": {"id": "2dc30c6e-f5d4-41f8-923f-52db38b510e8", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.56", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "a8236fdd83234f75a229055fe16f088d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3d31a554-a94c-4471-892f-f65aa87b8279", "external-id": "nsx-vlan-transportzone-241", "segmentation_id": 241, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapda4712f4-c2", "ovs_interfaceid": "da4712f4-c2f6-4e13-b668-b54f62d5a679", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 951.047172] env[60024]: DEBUG oslo_concurrency.lockutils [req-06add9b7-88cf-49ef-8318-d570a92ace54 req-ab9932e7-9d01-48b0-90fe-d9d22d9fdb6c service nova] Releasing lock "refresh_cache-d55ee9a1-6921-4648-ace2-f2da13c3523e" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 951.258656] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576292, 'name': CreateVM_Task, 'duration_secs': 0.335223} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 951.259651] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 951.259943] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 951.260148] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 951.260480] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 951.260924] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-43c85e7f-05c0-4d9f-86dd-03462da69836 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 951.266057] env[60024]: DEBUG oslo_vmware.api [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Waiting for the task: (returnval){ [ 951.266057] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52c93d05-3697-7a1b-f7dd-e889fdb7e804" [ 951.266057] env[60024]: _type = "Task" [ 951.266057] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 951.276277] env[60024]: DEBUG oslo_vmware.api [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52c93d05-3697-7a1b-f7dd-e889fdb7e804, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 951.778596] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 951.778596] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 951.778596] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 964.214175] env[60024]: DEBUG nova.compute.manager [req-3e073dfc-31d2-4170-9fe8-27a13afe9006 req-d73ac657-71fb-4c62-93a0-7f2136dbdf6b service nova] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Received event network-vif-deleted-00cb7570-258d-4c23-bcaf-72891c8b2671 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 993.516322] env[60024]: WARNING oslo_vmware.rw_handles [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 993.516322] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 993.516322] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 993.516322] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 993.516322] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 993.516322] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 993.516322] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 993.516322] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 993.516322] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 993.516322] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 993.516322] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 993.516322] env[60024]: ERROR oslo_vmware.rw_handles [ 993.516322] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to 
vmware_temp/093f466c-4b68-4d40-8465-c38da6c640dc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 993.517050] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 993.517428] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Copying Virtual Disk [datastore2] vmware_temp/093f466c-4b68-4d40-8465-c38da6c640dc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/093f466c-4b68-4d40-8465-c38da6c640dc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 993.517904] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-57a3001c-6746-4bbc-b602-1bd0489bde65 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 993.528190] env[60024]: DEBUG oslo_vmware.api [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Waiting for the task: (returnval){ [ 993.528190] env[60024]: value = "task-4576293" [ 993.528190] env[60024]: _type = "Task" [ 993.528190] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 993.537340] env[60024]: DEBUG oslo_vmware.api [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Task: {'id': task-4576293, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 994.039875] env[60024]: DEBUG oslo_vmware.exceptions [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Fault InvalidArgument not matched. 
{{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 994.040487] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 994.041194] env[60024]: ERROR nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 994.041194] env[60024]: Faults: ['InvalidArgument'] [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Traceback (most recent call last): [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] yield resources [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] self.driver.spawn(context, instance, image_meta, [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] self._vmops.spawn(context, instance, image_meta, injected_files, [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] self._fetch_image_if_missing(context, vi) [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] image_cache(vi, tmp_image_ds_loc) [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] vm_util.copy_virtual_disk( [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] session._wait_for_task(vmdk_copy_task) [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 994.041194] 
env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] return self.wait_for_task(task_ref) [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] return evt.wait() [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] result = hub.switch() [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] return self.greenlet.switch() [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] self.f(*self.args, **self.kw) [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] raise exceptions.translate_fault(task_info.error) [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Faults: ['InvalidArgument'] [ 994.041194] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] [ 994.044050] env[60024]: INFO nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Terminating instance [ 994.045366] env[60024]: DEBUG nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 994.045736] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 994.046129] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 994.048783] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 994.048783] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88ae3291-a767-4b48-ba03-fcf1e43e858a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.053242] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5d00440f-b3e9-4f13-be8b-d34e310460d6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.062911] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 994.065025] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-973caf0c-2454-440b-befa-f09e53958975 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.066384] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 994.066674] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 994.067465] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-26e5e523-0d8c-408a-ae7f-0793c064f342 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.075665] env[60024]: DEBUG oslo_vmware.api [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Waiting for the task: (returnval){ [ 994.075665] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]5243b4cd-672b-d8f9-2c36-2dfe3afa8969" [ 994.075665] env[60024]: _type = "Task" [ 994.075665] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 994.085415] env[60024]: DEBUG oslo_vmware.api [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]5243b4cd-672b-d8f9-2c36-2dfe3afa8969, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 994.219194] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 994.219194] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 994.219194] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Deleting the datastore file [datastore2] 37916d26-1b5e-4991-83a2-ca5a5b00c2ac {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 994.219194] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8e1f8355-028c-498b-a5f9-323ed12decbb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.228858] env[60024]: DEBUG oslo_vmware.api [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Waiting for the task: (returnval){ [ 994.228858] env[60024]: value = "task-4576295" [ 994.228858] env[60024]: _type = "Task" [ 994.228858] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 994.242591] env[60024]: DEBUG oslo_vmware.api [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Task: {'id': task-4576295, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 994.592078] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 994.592078] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Creating directory with path [datastore2] vmware_temp/74faba69-45dd-47d5-ba24-7b71a7cdcd20/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 994.592078] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0c5be561-505e-4dea-b966-e9ffd10bf490 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.604438] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Created directory with path [datastore2] vmware_temp/74faba69-45dd-47d5-ba24-7b71a7cdcd20/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 994.604661] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Fetch image to [datastore2] vmware_temp/74faba69-45dd-47d5-ba24-7b71a7cdcd20/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 994.604895] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/74faba69-45dd-47d5-ba24-7b71a7cdcd20/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 994.605710] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-450d429d-2685-43fe-ac9a-1364e30bbd2a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.616166] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b78b5cc-e6ee-4801-8695-c21e60bff4c8 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.629529] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f31282f7-35af-476a-8eae-0a11b2d26f3d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.666299] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed788630-0dcd-45c0-a7e4-225a0038db70 {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.676353] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7631ab78-172d-4c1a-8778-4d0269614d8e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.699647] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 994.739752] env[60024]: DEBUG oslo_vmware.api [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Task: {'id': task-4576295, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.097018} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 994.740031] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 994.740220] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 994.740391] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 994.740564] env[60024]: INFO nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Took 0.69 seconds to destroy the instance on the hypervisor. 
[ 994.743143] env[60024]: DEBUG nova.compute.claims [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 994.743313] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 994.743545] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 994.767220] env[60024]: DEBUG oslo_vmware.rw_handles [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/74faba69-45dd-47d5-ba24-7b71a7cdcd20/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 994.835810] env[60024]: DEBUG oslo_vmware.rw_handles [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Completed reading data from the image iterator. {{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 994.835995] env[60024]: DEBUG oslo_vmware.rw_handles [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/74faba69-45dd-47d5-ba24-7b71a7cdcd20/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 994.958724] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2a4d10e-37e3-40c3-a787-59a879c3328c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 994.969736] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dac3f168-4f0e-4d43-a1ff-c13b183d5e6d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.005886] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-638d5b30-2049-4524-82a9-9fe8c03f0fb9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.014841] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-617e5d43-18f1-4a46-bcee-cfc0750d06c7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 995.031196] env[60024]: DEBUG nova.compute.provider_tree [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 995.041731] env[60024]: DEBUG nova.scheduler.client.report [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 995.058123] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.314s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 995.058123] env[60024]: ERROR nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 995.058123] env[60024]: Faults: ['InvalidArgument'] [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Traceback (most recent call last): [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] 
self.driver.spawn(context, instance, image_meta, [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] self._vmops.spawn(context, instance, image_meta, injected_files, [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] self._fetch_image_if_missing(context, vi) [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] image_cache(vi, tmp_image_ds_loc) [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] vm_util.copy_virtual_disk( [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] session._wait_for_task(vmdk_copy_task) [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] return self.wait_for_task(task_ref) [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] return evt.wait() [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] result = hub.switch() [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] return self.greenlet.switch() [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] self.f(*self.args, **self.kw) [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 
37916d26-1b5e-4991-83a2-ca5a5b00c2ac] raise exceptions.translate_fault(task_info.error) [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Faults: ['InvalidArgument'] [ 995.058123] env[60024]: ERROR nova.compute.manager [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] [ 995.059349] env[60024]: DEBUG nova.compute.utils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] VimFaultException {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 995.060280] env[60024]: DEBUG nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Build of instance 37916d26-1b5e-4991-83a2-ca5a5b00c2ac was re-scheduled: A specified parameter was not correct: fileType [ 995.060280] env[60024]: Faults: ['InvalidArgument'] {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 995.060647] env[60024]: DEBUG nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 995.060820] env[60024]: DEBUG nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 995.060983] env[60024]: DEBUG nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 995.061158] env[60024]: DEBUG nova.network.neutron [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 996.357051] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 996.691283] env[60024]: DEBUG nova.network.neutron [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 996.701866] env[60024]: INFO nova.compute.manager [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Took 1.64 seconds to deallocate network for instance. 
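The traceback above comes out of the task-polling loop in oslo.vmware: _poll_task sees the CopyVirtualDisk_Task enter an error state and raises the translated fault, which spawn() propagates so the compute manager can abort the claim, deallocate networking, and hand the request back to the scheduler. Below is a minimal sketch of that poll-and-translate pattern using only the standard library; the callable and the dict shape are assumptions for illustration, not oslo.vmware's API.

# Minimal sketch of task polling with fault translation (not oslo.vmware code).
import time

class TaskFault(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, message, faults):
        super().__init__(message)
        self.fault_list = faults

def wait_for_task(poll_task_info, interval=0.5):
    """poll_task_info() is a hypothetical callable returning a dict like
    {'state': 'running'|'success'|'error', 'error': str, 'faults': [str]}."""
    while True:
        info = poll_task_info()
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            # Analogue of `raise exceptions.translate_fault(task_info.error)`
            # in the traceback: the server-side fault becomes a local exception.
            raise TaskFault(info["error"], info.get("faults", []))
        time.sleep(interval)

In the log the translated fault is InvalidArgument on fileType, and the compute manager responds by re-scheduling the build rather than retrying it locally.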
[ 996.821568] env[60024]: INFO nova.scheduler.client.report [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Deleted allocations for instance 37916d26-1b5e-4991-83a2-ca5a5b00c2ac [ 996.838998] env[60024]: DEBUG oslo_concurrency.lockutils [None req-e59749df-cdb7-431a-98b4-6be65789904d tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 424.275s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 996.840208] env[60024]: DEBUG oslo_concurrency.lockutils [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 226.175s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 996.840339] env[60024]: DEBUG oslo_concurrency.lockutils [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Acquiring lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 996.840538] env[60024]: DEBUG oslo_concurrency.lockutils [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 996.840695] env[60024]: DEBUG oslo_concurrency.lockutils [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 996.844220] env[60024]: INFO nova.compute.manager [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Terminating instance [ 996.848885] env[60024]: DEBUG nova.compute.manager [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 996.849559] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 996.849559] env[60024]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1c74340c-08ee-43a1-a527-49c8b8cf192a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.862362] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea72d5c1-b9f3-4339-bf33-af5746389129 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 996.874355] env[60024]: DEBUG nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 996.900878] env[60024]: WARNING nova.virt.vmwareapi.vmops [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 37916d26-1b5e-4991-83a2-ca5a5b00c2ac could not be found. [ 996.901232] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 996.901796] env[60024]: INFO nova.compute.manager [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Took 0.05 seconds to destroy the instance on the hypervisor. [ 996.901796] env[60024]: DEBUG oslo.service.loopingcall [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 996.902016] env[60024]: DEBUG nova.compute.manager [-] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 996.902222] env[60024]: DEBUG nova.network.neutron [-] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 996.931448] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 996.931713] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 996.933212] env[60024]: INFO nova.compute.claims [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 996.959695] env[60024]: DEBUG nova.network.neutron [-] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 996.983123] env[60024]: INFO nova.compute.manager [-] [instance: 37916d26-1b5e-4991-83a2-ca5a5b00c2ac] Took 0.08 seconds to deallocate network for instance. 
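The terminate path above is effectively idempotent: the earlier failed build already removed the VM, so SearchIndex.FindAllByUuid finds nothing, InstanceNotFound is downgraded to a warning, and network deallocation still runs so ports and allocations are freed. A hedged sketch of that behaviour follows; the helper names are placeholders, not Nova's.

# Illustrative sketch of idempotent instance teardown (placeholder helpers).
import logging

LOG = logging.getLogger(__name__)

class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""

def destroy_instance(find_vm_ref, destroy_vm, deallocate_network, instance_uuid):
    try:
        vm_ref = find_vm_ref(instance_uuid)
        destroy_vm(vm_ref)
    except InstanceNotFound:
        # Mirror the WARNING in the log: missing on the backend is not fatal.
        LOG.warning("Instance does not exist on backend: %s", instance_uuid)
    # Cleanup continues regardless so networking and allocations are released.
    deallocate_network(instance_uuid)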
[ 997.091231] env[60024]: DEBUG oslo_concurrency.lockutils [None req-c1e49fd7-710f-4938-9c5f-2d292e3c1b4e tempest-ServersTestMultiNic-1605036069 tempest-ServersTestMultiNic-1605036069-project-member] Lock "37916d26-1b5e-4991-83a2-ca5a5b00c2ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.251s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 997.122537] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a16f3bc4-1245-477a-85cc-9f8d7b85ddef {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.131545] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-924eb211-af36-4b39-b182-225f51c9a3ec {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.169027] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05bf56fa-53b8-44cc-adb8-50dabf3b9144 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.177972] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-273a72fc-d2df-4d61-9e28-8badec3d6d31 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.193351] env[60024]: DEBUG nova.compute.provider_tree [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 997.202914] env[60024]: DEBUG nova.scheduler.client.report [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 997.218941] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 997.219506] env[60024]: DEBUG nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Start building networks asynchronously for instance. 
{{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 997.258999] env[60024]: DEBUG nova.compute.utils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 997.263726] env[60024]: DEBUG nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 997.263726] env[60024]: DEBUG nova.network.neutron [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 997.279928] env[60024]: DEBUG nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 997.359497] env[60024]: DEBUG nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 997.382965] env[60024]: DEBUG nova.virt.hardware [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 997.383223] env[60024]: DEBUG nova.virt.hardware [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 997.383380] env[60024]: DEBUG nova.virt.hardware [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 997.383597] env[60024]: DEBUG nova.virt.hardware [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 997.383803] env[60024]: DEBUG nova.virt.hardware [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 997.383883] env[60024]: DEBUG nova.virt.hardware [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 997.384754] env[60024]: DEBUG nova.virt.hardware [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 997.384927] env[60024]: DEBUG nova.virt.hardware [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 997.385126] env[60024]: DEBUG nova.virt.hardware [None 
req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 997.385303] env[60024]: DEBUG nova.virt.hardware [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 997.385469] env[60024]: DEBUG nova.virt.hardware [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 997.386395] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cdaa957-9a1b-402b-b5a2-94a7b4a4d0fb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.398514] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e26f850-ae61-469c-98b2-0e3b7f8ff79b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 997.418256] env[60024]: DEBUG nova.policy [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b0100cdbf36b40da85334a72b9121fc5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '60bf6fde086045b492b838eab8435479', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 998.394021] env[60024]: DEBUG nova.network.neutron [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Successfully created port: 544a8ec1-715f-48ac-a525-4a8b19ccf72f {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 999.341142] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 999.341416] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 999.355774] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 999.355974] env[60024]: 
DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 999.356289] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 999.356356] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60024) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 999.357917] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c30b77e1-e43b-45b2-81f8-6decc33de220 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.368956] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00e95a56-3844-4593-89df-68dff04fff8f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.385621] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33ddee17-5ebd-41e5-b584-0daad2677eba {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.392520] env[60024]: DEBUG nova.network.neutron [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Successfully updated port: 544a8ec1-715f-48ac-a525-4a8b19ccf72f {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 999.396127] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c597769-0db8-4200-a935-c1d0ee0a25e3 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.400626] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Acquiring lock "refresh_cache-e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 999.400773] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Acquired lock "refresh_cache-e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 999.400921] env[60024]: DEBUG nova.network.neutron [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 999.432777] env[60024]: DEBUG 
nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180696MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60024) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 999.435195] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 999.435195] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 999.475066] env[60024]: DEBUG nova.network.neutron [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 999.507973] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 3ab1b905-cd6f-4d2b-a244-f85e56f796d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 999.510283] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 036d6de2-f69b-4714-b89e-9c4307253675 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 999.510283] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 999.510283] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance d55ee9a1-6921-4648-ace2-f2da13c3523e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 999.510283] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 999.523276] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 54ded864-1c3e-4a47-968f-ca597c82cb87 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 999.538194] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance e259637a-0fc8-4368-8a7a-c15a134ed17d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 999.538493] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Total usable vcpus: 48, total allocated vcpus: 5 {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 999.538588] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1152MB phys_disk=100GB used_disk=5GB total_vcpus=48 used_vcpus=5 pci_stats=[] {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 999.673233] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a43139b2-0500-4849-a4d6-54bbf9bf975f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.682356] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9d76cd5-cbe3-4fdf-9d11-5f6e0783b304 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.689600] env[60024]: DEBUG nova.compute.manager [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Received event network-vif-plugged-544a8ec1-715f-48ac-a525-4a8b19ccf72f {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 999.689740] env[60024]: DEBUG oslo_concurrency.lockutils [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] Acquiring lock "e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 999.689953] env[60024]: DEBUG oslo_concurrency.lockutils [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] Lock "e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 999.690135] env[60024]: DEBUG oslo_concurrency.lockutils [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 
req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] Lock "e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 999.690298] env[60024]: DEBUG nova.compute.manager [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] No waiting events found dispatching network-vif-plugged-544a8ec1-715f-48ac-a525-4a8b19ccf72f {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 999.690460] env[60024]: WARNING nova.compute.manager [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Received unexpected event network-vif-plugged-544a8ec1-715f-48ac-a525-4a8b19ccf72f for instance with vm_state building and task_state spawning. [ 999.691045] env[60024]: DEBUG nova.compute.manager [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Received event network-changed-544a8ec1-715f-48ac-a525-4a8b19ccf72f {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 999.691045] env[60024]: DEBUG nova.compute.manager [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Refreshing instance network info cache due to event network-changed-544a8ec1-715f-48ac-a525-4a8b19ccf72f. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 999.691045] env[60024]: DEBUG oslo_concurrency.lockutils [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] Acquiring lock "refresh_cache-e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 999.721771] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1856803-9592-4c6a-8390-7b7786953d87 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.732379] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5e6d6fd-3d69-413d-9e46-b45b96eef585 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.750285] env[60024]: DEBUG nova.compute.provider_tree [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 999.760526] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 999.783125] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60024) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 999.783351] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.350s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 999.812697] env[60024]: DEBUG nova.network.neutron [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Updating instance_info_cache with network_info: [{"id": "544a8ec1-715f-48ac-a525-4a8b19ccf72f", "address": "fa:16:3e:7f:f1:0f", "network": {"id": "bdd0107f-8a2b-4984-904e-3fc8f15a2740", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-855850293-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "60bf6fde086045b492b838eab8435479", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea45c024-d603-4bac-9c1b-f302437ea4fe", "external-id": "nsx-vlan-transportzone-946", "segmentation_id": 946, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap544a8ec1-71", "ovs_interfaceid": "544a8ec1-715f-48ac-a525-4a8b19ccf72f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 999.830994] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Releasing lock "refresh_cache-e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 999.831341] env[60024]: DEBUG nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Instance network_info: |[{"id": "544a8ec1-715f-48ac-a525-4a8b19ccf72f", "address": "fa:16:3e:7f:f1:0f", "network": {"id": "bdd0107f-8a2b-4984-904e-3fc8f15a2740", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-855850293-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "60bf6fde086045b492b838eab8435479", "mtu": 
8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea45c024-d603-4bac-9c1b-f302437ea4fe", "external-id": "nsx-vlan-transportzone-946", "segmentation_id": 946, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap544a8ec1-71", "ovs_interfaceid": "544a8ec1-715f-48ac-a525-4a8b19ccf72f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 999.831622] env[60024]: DEBUG oslo_concurrency.lockutils [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] Acquired lock "refresh_cache-e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 999.831794] env[60024]: DEBUG nova.network.neutron [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Refreshing network info cache for port 544a8ec1-715f-48ac-a525-4a8b19ccf72f {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 999.833676] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7f:f1:0f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ea45c024-d603-4bac-9c1b-f302437ea4fe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '544a8ec1-715f-48ac-a525-4a8b19ccf72f', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 999.841178] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Creating folder: Project (60bf6fde086045b492b838eab8435479). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 999.841894] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0b519148-e86b-4c2d-962f-c62674c9ce8a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.857043] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Created folder: Project (60bf6fde086045b492b838eab8435479) in parent group-v894073. [ 999.857263] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Creating folder: Instances. Parent ref: group-v894130. 
{{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 999.857503] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8ba4e6df-a345-4a38-822a-fdd723e33055 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.870515] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Created folder: Instances in parent group-v894130. [ 999.870768] env[60024]: DEBUG oslo.service.loopingcall [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 999.870968] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 999.871182] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6ee7c074-721e-4699-8081-22a88bff297f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.894889] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 999.894889] env[60024]: value = "task-4576298" [ 999.894889] env[60024]: _type = "Task" [ 999.894889] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 999.905163] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576298, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1000.406223] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576298, 'name': CreateVM_Task, 'duration_secs': 0.332034} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1000.408927] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1000.409711] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1000.409981] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1000.410367] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1000.410925] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0ed59246-893b-4f04-82b8-efe7b9d25a51 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1000.417340] env[60024]: DEBUG oslo_vmware.api [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Waiting for the task: (returnval){ [ 1000.417340] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52d56f85-91dc-bd55-7361-9822d2531d25" [ 1000.417340] env[60024]: _type = "Task" [ 1000.417340] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1000.427944] env[60024]: DEBUG oslo_vmware.api [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52d56f85-91dc-bd55-7361-9822d2531d25, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1000.514883] env[60024]: DEBUG nova.network.neutron [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Updated VIF entry in instance network info cache for port 544a8ec1-715f-48ac-a525-4a8b19ccf72f. 
{{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1000.514883] env[60024]: DEBUG nova.network.neutron [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Updating instance_info_cache with network_info: [{"id": "544a8ec1-715f-48ac-a525-4a8b19ccf72f", "address": "fa:16:3e:7f:f1:0f", "network": {"id": "bdd0107f-8a2b-4984-904e-3fc8f15a2740", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-855850293-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "60bf6fde086045b492b838eab8435479", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea45c024-d603-4bac-9c1b-f302437ea4fe", "external-id": "nsx-vlan-transportzone-946", "segmentation_id": 946, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap544a8ec1-71", "ovs_interfaceid": "544a8ec1-715f-48ac-a525-4a8b19ccf72f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1000.525614] env[60024]: DEBUG oslo_concurrency.lockutils [req-e5ec3f04-d343-4e6b-86a7-8c51bd63d852 req-7760e1e7-362d-4759-bdc4-e47e81d35317 service nova] Releasing lock "refresh_cache-e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1000.784316] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1000.932146] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1000.932146] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1000.932146] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1001.720801] env[60024]: DEBUG nova.compute.manager [req-7cfa90f1-ad9b-4fa4-b451-c2ce37c3d3ca 
req-ecbeb302-8583-4a6c-8e7b-f46263776ac8 service nova] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Received event network-vif-deleted-da4712f4-c2f6-4e13-b668-b54f62d5a679 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1002.342307] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1003.337244] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1003.342683] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1003.342683] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Starting heal instance info cache {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1003.342683] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Rebuilding the list of instances to heal {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1003.358019] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1003.358019] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1003.358019] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1003.358019] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1003.358019] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Didn't find any instances for network info cache update. 
{{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1003.358019] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1003.358019] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1003.359374] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1003.359374] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60024) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1008.906906] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Acquiring lock "8b64034a-4d67-4605-adb8-a007dd735230" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1008.906906] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Lock "8b64034a-4d67-4605-adb8-a007dd735230" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1041.383046] env[60024]: WARNING oslo_vmware.rw_handles [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1041.383046] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1041.383046] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1041.383046] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1041.383046] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1041.383046] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 1041.383046] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1041.383046] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1041.383046] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1041.383046] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1041.383046] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed 
connection without response [ 1041.383046] env[60024]: ERROR oslo_vmware.rw_handles [ 1041.383672] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to vmware_temp/74faba69-45dd-47d5-ba24-7b71a7cdcd20/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1041.385442] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1041.385717] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Copying Virtual Disk [datastore2] vmware_temp/74faba69-45dd-47d5-ba24-7b71a7cdcd20/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/74faba69-45dd-47d5-ba24-7b71a7cdcd20/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1041.386057] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ff8e82f9-6e7e-452e-b10b-db418171784e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.396596] env[60024]: DEBUG oslo_vmware.api [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Waiting for the task: (returnval){ [ 1041.396596] env[60024]: value = "task-4576299" [ 1041.396596] env[60024]: _type = "Task" [ 1041.396596] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1041.405826] env[60024]: DEBUG oslo_vmware.api [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Task: {'id': task-4576299, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1041.907067] env[60024]: DEBUG oslo_vmware.exceptions [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Fault InvalidArgument not matched. 
{{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1041.907334] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1041.907861] env[60024]: ERROR nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1041.907861] env[60024]: Faults: ['InvalidArgument'] [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Traceback (most recent call last): [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] yield resources [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] self.driver.spawn(context, instance, image_meta, [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] self._fetch_image_if_missing(context, vi) [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] image_cache(vi, tmp_image_ds_loc) [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] vm_util.copy_virtual_disk( [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] session._wait_for_task(vmdk_copy_task) [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] return self.wait_for_task(task_ref) [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] return evt.wait() [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] result = hub.switch() [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] return self.greenlet.switch() [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] self.f(*self.args, **self.kw) [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] raise exceptions.translate_fault(task_info.error) [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Faults: ['InvalidArgument'] [ 1041.907861] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] [ 1041.909035] env[60024]: INFO nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Terminating instance [ 1041.909737] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1041.909941] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1041.910182] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2b9f952d-6252-49cd-85ca-caffa91b2de1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
1041.912603] env[60024]: DEBUG nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1041.912800] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1041.913515] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c298ee67-5f33-4129-ab03-7b6940ae5a23 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.920696] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1041.920935] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1e71ea9f-59f5-469a-9734-4fcaacf2021a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.923206] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1041.923352] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1041.924337] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6e0c0cee-22a2-46b2-96cb-dfc15efc7e55 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1041.929934] env[60024]: DEBUG oslo_vmware.api [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Waiting for the task: (returnval){ [ 1041.929934] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]5220ad35-8115-2e70-31d5-bec7a7c2c53e" [ 1041.929934] env[60024]: _type = "Task" [ 1041.929934] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1041.937403] env[60024]: DEBUG oslo_vmware.api [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]5220ad35-8115-2e70-31d5-bec7a7c2c53e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1041.993379] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1041.993596] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1041.993838] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Deleting the datastore file [datastore2] 3ab1b905-cd6f-4d2b-a244-f85e56f796d3 {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1041.994129] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1685b28a-6937-4f5c-819c-7bbd03b36310 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.001894] env[60024]: DEBUG oslo_vmware.api [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Waiting for the task: (returnval){ [ 1042.001894] env[60024]: value = "task-4576301" [ 1042.001894] env[60024]: _type = "Task" [ 1042.001894] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1042.012684] env[60024]: DEBUG oslo_vmware.api [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Task: {'id': task-4576301, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1042.440828] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1042.441204] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Creating directory with path [datastore2] vmware_temp/62e83edb-3cf7-474c-b54d-f018e4fec1a7/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1042.441342] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3ce47c62-03e5-407e-86e3-563d787d4537 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.453816] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Created directory with path [datastore2] vmware_temp/62e83edb-3cf7-474c-b54d-f018e4fec1a7/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1042.454062] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Fetch image to [datastore2] vmware_temp/62e83edb-3cf7-474c-b54d-f018e4fec1a7/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1042.454258] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/62e83edb-3cf7-474c-b54d-f018e4fec1a7/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1042.455164] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35a21a6b-8d5d-45d5-97b6-2556c0c247ad {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.462656] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-599bf35f-8181-4738-9586-7569a4d318f9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.472012] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-868660cd-f739-4331-8c94-41e593ade5ec {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.502327] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-87738e3c-9007-48ae-8c0e-0ee0b7123915 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.514110] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-01d51774-c57b-4d85-b4ba-0f8c468fa24f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.515821] env[60024]: DEBUG oslo_vmware.api [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Task: {'id': task-4576301, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081815} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1042.516065] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1042.516282] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1042.516455] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1042.516624] env[60024]: INFO nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1042.518782] env[60024]: DEBUG nova.compute.claims [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1042.518950] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1042.519197] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1042.608112] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1042.655234] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bff07fd-0c4d-4cde-b8f8-d7227d01d111 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.659075] env[60024]: DEBUG oslo_vmware.rw_handles [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/62e83edb-3cf7-474c-b54d-f018e4fec1a7/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1042.715751] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c38a861a-e390-445d-94ab-5ff2ce073200 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.720940] env[60024]: DEBUG oslo_vmware.rw_handles [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Completed reading data from the image iterator. {{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1042.721181] env[60024]: DEBUG oslo_vmware.rw_handles [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/62e83edb-3cf7-474c-b54d-f018e4fec1a7/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1042.748045] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-134554f8-03a1-4c07-bbe4-3265788401ad {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.755951] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbbce069-f49e-4b87-86fa-58e7b9a43b1a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1042.769083] env[60024]: DEBUG nova.compute.provider_tree [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1042.777969] env[60024]: DEBUG nova.scheduler.client.report [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1042.793127] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.274s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1042.793653] env[60024]: ERROR nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1042.793653] env[60024]: Faults: ['InvalidArgument'] [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Traceback (most recent call last): [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] self.driver.spawn(context, instance, image_meta, [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] self._fetch_image_if_missing(context, vi) [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] image_cache(vi, tmp_image_ds_loc) [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] vm_util.copy_virtual_disk( [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] session._wait_for_task(vmdk_copy_task) [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] return self.wait_for_task(task_ref) [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] return evt.wait() [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] result = hub.switch() [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] return self.greenlet.switch() [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] self.f(*self.args, **self.kw) [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] raise exceptions.translate_fault(task_info.error) [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Faults: ['InvalidArgument'] [ 1042.793653] env[60024]: ERROR nova.compute.manager [instance: 
3ab1b905-cd6f-4d2b-a244-f85e56f796d3] [ 1042.794863] env[60024]: DEBUG nova.compute.utils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] VimFaultException {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1042.795792] env[60024]: DEBUG nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Build of instance 3ab1b905-cd6f-4d2b-a244-f85e56f796d3 was re-scheduled: A specified parameter was not correct: fileType [ 1042.795792] env[60024]: Faults: ['InvalidArgument'] {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1042.796183] env[60024]: DEBUG nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1042.796354] env[60024]: DEBUG nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1042.796518] env[60024]: DEBUG nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1042.796676] env[60024]: DEBUG nova.network.neutron [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1043.075776] env[60024]: DEBUG nova.network.neutron [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1043.085709] env[60024]: INFO nova.compute.manager [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Took 0.29 seconds to deallocate network for instance. 
[ 1043.171143] env[60024]: INFO nova.scheduler.client.report [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Deleted allocations for instance 3ab1b905-cd6f-4d2b-a244-f85e56f796d3 [ 1043.187763] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6f634d99-9457-48c6-a6d2-9ded8cd6c6c3 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "3ab1b905-cd6f-4d2b-a244-f85e56f796d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 461.459s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.188871] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "3ab1b905-cd6f-4d2b-a244-f85e56f796d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 263.148s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1043.189100] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Acquiring lock "3ab1b905-cd6f-4d2b-a244-f85e56f796d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1043.189310] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "3ab1b905-cd6f-4d2b-a244-f85e56f796d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1043.189479] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "3ab1b905-cd6f-4d2b-a244-f85e56f796d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.191715] env[60024]: INFO nova.compute.manager [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Terminating instance [ 1043.193890] env[60024]: DEBUG nova.compute.manager [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1043.194149] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1043.194410] env[60024]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6b034e8c-f2b7-4c43-800e-b55beeb20b61 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.204547] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f42a9603-b57f-4382-9d7b-d44064396c97 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.216987] env[60024]: DEBUG nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1043.238660] env[60024]: WARNING nova.virt.vmwareapi.vmops [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3ab1b905-cd6f-4d2b-a244-f85e56f796d3 could not be found. [ 1043.238976] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1043.239079] env[60024]: INFO nova.compute.manager [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1043.239251] env[60024]: DEBUG oslo.service.loopingcall [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1043.239480] env[60024]: DEBUG nova.compute.manager [-] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1043.239574] env[60024]: DEBUG nova.network.neutron [-] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1043.262999] env[60024]: DEBUG nova.network.neutron [-] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1043.265932] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1043.266186] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1043.267715] env[60024]: INFO nova.compute.claims [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1043.270997] env[60024]: INFO nova.compute.manager [-] [instance: 3ab1b905-cd6f-4d2b-a244-f85e56f796d3] Took 0.03 seconds to deallocate network for instance. 
[ 1043.359039] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f3ed154b-4ecd-4d42-9310-d1d2294386f0 tempest-ServersAdminTestJSON-1213463885 tempest-ServersAdminTestJSON-1213463885-project-member] Lock "3ab1b905-cd6f-4d2b-a244-f85e56f796d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.170s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.401268] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-804781eb-7b7a-4905-bc92-8a9a00648458 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.410030] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9900a43-22e2-46f3-9894-7c398a477bdc {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.441022] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-831b6fb3-2f15-4d38-8ffd-199ee6d6fe45 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.449078] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-772275d2-9e77-499e-a7cd-bc9218db9463 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.464503] env[60024]: DEBUG nova.compute.provider_tree [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1043.472872] env[60024]: DEBUG nova.scheduler.client.report [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1043.486027] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1043.486272] env[60024]: DEBUG nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Start building networks asynchronously for instance. 
{{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1043.520512] env[60024]: DEBUG nova.compute.utils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1043.521810] env[60024]: DEBUG nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1043.522020] env[60024]: DEBUG nova.network.neutron [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1043.530549] env[60024]: DEBUG nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1043.580053] env[60024]: DEBUG nova.policy [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '885cac70fdc44efaaa3b2f0e1bfb2907', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9fb45b1c3ac3465c8b7737a90a8a968e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 1043.593417] env[60024]: DEBUG nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1043.614257] env[60024]: DEBUG nova.virt.hardware [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1043.614579] env[60024]: DEBUG nova.virt.hardware [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1043.614671] env[60024]: DEBUG nova.virt.hardware [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1043.614854] env[60024]: DEBUG nova.virt.hardware [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1043.615008] env[60024]: DEBUG nova.virt.hardware [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1043.615161] env[60024]: DEBUG nova.virt.hardware [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1043.615366] env[60024]: DEBUG nova.virt.hardware [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1043.615523] env[60024]: DEBUG nova.virt.hardware [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1043.615687] env[60024]: DEBUG 
nova.virt.hardware [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1043.615841] env[60024]: DEBUG nova.virt.hardware [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1043.616019] env[60024]: DEBUG nova.virt.hardware [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1043.616910] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fec6bb60-818b-4e8a-a979-ab19d8b79f3b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.625659] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b572001-5158-4403-b186-859ff225806d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1043.961847] env[60024]: DEBUG nova.network.neutron [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Successfully created port: c3dd8b40-8209-48ef-8948-1141969c9c9f {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1044.805216] env[60024]: DEBUG nova.network.neutron [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Successfully updated port: c3dd8b40-8209-48ef-8948-1141969c9c9f {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1044.814596] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Acquiring lock "refresh_cache-54ded864-1c3e-4a47-968f-ca597c82cb87" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1044.814596] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Acquired lock "refresh_cache-54ded864-1c3e-4a47-968f-ca597c82cb87" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1044.814596] env[60024]: DEBUG nova.network.neutron [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1044.853565] env[60024]: DEBUG nova.network.neutron [None req-683d5b43-8eb8-4924-b841-8951f706b180 
tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1045.059702] env[60024]: DEBUG nova.network.neutron [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Updating instance_info_cache with network_info: [{"id": "c3dd8b40-8209-48ef-8948-1141969c9c9f", "address": "fa:16:3e:43:d5:29", "network": {"id": "890cc710-5cb3-4f7b-b1d5-a435937e41b3", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1360700645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9fb45b1c3ac3465c8b7737a90a8a968e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "53ebf5df-5ecb-4a0c-a163-d88165639de0", "external-id": "nsx-vlan-transportzone-588", "segmentation_id": 588, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc3dd8b40-82", "ovs_interfaceid": "c3dd8b40-8209-48ef-8948-1141969c9c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1045.070023] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Releasing lock "refresh_cache-54ded864-1c3e-4a47-968f-ca597c82cb87" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1045.070330] env[60024]: DEBUG nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Instance network_info: |[{"id": "c3dd8b40-8209-48ef-8948-1141969c9c9f", "address": "fa:16:3e:43:d5:29", "network": {"id": "890cc710-5cb3-4f7b-b1d5-a435937e41b3", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1360700645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9fb45b1c3ac3465c8b7737a90a8a968e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "53ebf5df-5ecb-4a0c-a163-d88165639de0", "external-id": "nsx-vlan-transportzone-588", "segmentation_id": 588, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc3dd8b40-82", "ovs_interfaceid": "c3dd8b40-8209-48ef-8948-1141969c9c9f", "qbh_params": null, 
"qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1045.070690] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:43:d5:29', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '53ebf5df-5ecb-4a0c-a163-d88165639de0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c3dd8b40-8209-48ef-8948-1141969c9c9f', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1045.078310] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Creating folder: Project (9fb45b1c3ac3465c8b7737a90a8a968e). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1045.078862] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f0c26e84-c321-4f83-a09d-c6af1b349ad0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.094952] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Created folder: Project (9fb45b1c3ac3465c8b7737a90a8a968e) in parent group-v894073. [ 1045.095192] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Creating folder: Instances. Parent ref: group-v894133. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1045.095441] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9f7fc585-6ef3-4b30-a090-61955603fc7d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.106348] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Created folder: Instances in parent group-v894133. [ 1045.106559] env[60024]: DEBUG oslo.service.loopingcall [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1045.106743] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1045.106934] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a2c41f52-4d54-4a78-8d8a-cf83dbcacd3a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1045.123337] env[60024]: DEBUG nova.compute.manager [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Received event network-vif-plugged-c3dd8b40-8209-48ef-8948-1141969c9c9f {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1045.123538] env[60024]: DEBUG oslo_concurrency.lockutils [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] Acquiring lock "54ded864-1c3e-4a47-968f-ca597c82cb87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1045.123741] env[60024]: DEBUG oslo_concurrency.lockutils [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] Lock "54ded864-1c3e-4a47-968f-ca597c82cb87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1045.123898] env[60024]: DEBUG oslo_concurrency.lockutils [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] Lock "54ded864-1c3e-4a47-968f-ca597c82cb87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1045.124144] env[60024]: DEBUG nova.compute.manager [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] No waiting events found dispatching network-vif-plugged-c3dd8b40-8209-48ef-8948-1141969c9c9f {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1045.124333] env[60024]: WARNING nova.compute.manager [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Received unexpected event network-vif-plugged-c3dd8b40-8209-48ef-8948-1141969c9c9f for instance with vm_state building and task_state spawning. [ 1045.124494] env[60024]: DEBUG nova.compute.manager [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Received event network-changed-c3dd8b40-8209-48ef-8948-1141969c9c9f {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1045.124647] env[60024]: DEBUG nova.compute.manager [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Refreshing instance network info cache due to event network-changed-c3dd8b40-8209-48ef-8948-1141969c9c9f. 
{{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1045.124827] env[60024]: DEBUG oslo_concurrency.lockutils [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] Acquiring lock "refresh_cache-54ded864-1c3e-4a47-968f-ca597c82cb87" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1045.124960] env[60024]: DEBUG oslo_concurrency.lockutils [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] Acquired lock "refresh_cache-54ded864-1c3e-4a47-968f-ca597c82cb87" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1045.125127] env[60024]: DEBUG nova.network.neutron [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Refreshing network info cache for port c3dd8b40-8209-48ef-8948-1141969c9c9f {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1045.132847] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1045.132847] env[60024]: value = "task-4576304" [ 1045.132847] env[60024]: _type = "Task" [ 1045.132847] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1045.143203] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576304, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1045.379893] env[60024]: DEBUG nova.network.neutron [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Updated VIF entry in instance network info cache for port c3dd8b40-8209-48ef-8948-1141969c9c9f. 
{{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1045.380287] env[60024]: DEBUG nova.network.neutron [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Updating instance_info_cache with network_info: [{"id": "c3dd8b40-8209-48ef-8948-1141969c9c9f", "address": "fa:16:3e:43:d5:29", "network": {"id": "890cc710-5cb3-4f7b-b1d5-a435937e41b3", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1360700645-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9fb45b1c3ac3465c8b7737a90a8a968e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "53ebf5df-5ecb-4a0c-a163-d88165639de0", "external-id": "nsx-vlan-transportzone-588", "segmentation_id": 588, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc3dd8b40-82", "ovs_interfaceid": "c3dd8b40-8209-48ef-8948-1141969c9c9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1045.389467] env[60024]: DEBUG oslo_concurrency.lockutils [req-25b08dc1-0cbe-48d2-af80-30d56d562982 req-c2e905bf-3f8a-4b99-b250-5bb6e3a1ef68 service nova] Releasing lock "refresh_cache-54ded864-1c3e-4a47-968f-ca597c82cb87" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1045.642919] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576304, 'name': CreateVM_Task} progress is 99%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1046.143425] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576304, 'name': CreateVM_Task} progress is 99%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1046.644695] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576304, 'name': CreateVM_Task, 'duration_secs': 1.309844} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1046.644894] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1046.645683] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1046.645859] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1046.646214] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1046.646490] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ead4ae68-a52c-464f-90c8-aeb6b395593b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1046.651571] env[60024]: DEBUG oslo_vmware.api [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Waiting for the task: (returnval){ [ 1046.651571] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]522ee539-6dc4-57d5-027c-e55bbaf04f1d" [ 1046.651571] env[60024]: _type = "Task" [ 1046.651571] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1046.659781] env[60024]: DEBUG oslo_vmware.api [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]522ee539-6dc4-57d5-027c-e55bbaf04f1d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1047.162517] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1047.162804] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1047.162971] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1060.341995] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1060.342358] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1060.353714] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1060.353954] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1060.354162] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1060.354334] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60024) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1060.355513] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79e50903-c82a-41fc-af65-8af3290c121b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1060.365096] env[60024]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ab9f891-7642-40d3-9f21-fd3995d96083 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1060.380409] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9517a71b-738e-4392-9ef8-2bd4ddf6430c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1060.387932] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87781878-4844-49ff-962e-8cb318164f96 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1060.420046] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180698MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60024) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1060.420225] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1060.420427] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1060.466396] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 036d6de2-f69b-4714-b89e-9c4307253675 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1060.466562] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1060.466688] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1060.466811] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 54ded864-1c3e-4a47-968f-ca597c82cb87 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1060.477957] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance e259637a-0fc8-4368-8a7a-c15a134ed17d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1060.488108] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 8b64034a-4d67-4605-adb8-a007dd735230 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1060.488324] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1060.488467] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=100GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1060.568731] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8267d410-e01f-43f3-84ef-2189b1d44075 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1060.577194] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e39de47-ac03-44f1-b2c4-d7bc00d92656 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1060.607140] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-544d5629-67b2-4888-abb4-6fa9dd4c454f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1060.615216] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28c823ae-71c0-48c3-851d-5782839489ec {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1060.630062] env[60024]: DEBUG nova.compute.provider_tree [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1060.639155] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1060.652147] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60024) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1060.652401] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.232s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1061.652533] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1063.341081] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1063.341448] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1063.341492] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60024) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1064.337490] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1064.341187] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1064.341507] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Starting heal instance info cache {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1064.341507] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Rebuilding the list of instances to heal {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1064.355449] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Skipping network cache update for instance because it is Building. 
{{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1064.355609] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1064.355734] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1064.355859] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1064.355983] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Didn't find any instances for network info cache update. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1064.356500] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1064.356679] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1082.033285] env[60024]: DEBUG nova.compute.manager [req-0a74927e-457f-4048-8c87-66e066cc242f req-55d858e0-427b-4838-a6f5-b857d71b4c88 service nova] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Received event network-vif-deleted-544a8ec1-715f-48ac-a525-4a8b19ccf72f {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1084.062716] env[60024]: DEBUG nova.compute.manager [req-37dda041-c023-40b0-9fd6-598e07970e6a req-6f7aa735-99e1-4062-8247-809ac5d3115f service nova] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Received event network-vif-deleted-c3dd8b40-8209-48ef-8948-1141969c9c9f {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1092.390961] env[60024]: WARNING oslo_vmware.rw_handles [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1092.390961] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1092.390961] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1092.390961] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1092.390961] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1092.390961] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 1092.390961] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in 
begin [ 1092.390961] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1092.390961] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1092.390961] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1092.390961] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1092.390961] env[60024]: ERROR oslo_vmware.rw_handles [ 1092.391550] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to vmware_temp/62e83edb-3cf7-474c-b54d-f018e4fec1a7/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1092.393568] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1092.393859] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Copying Virtual Disk [datastore2] vmware_temp/62e83edb-3cf7-474c-b54d-f018e4fec1a7/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/62e83edb-3cf7-474c-b54d-f018e4fec1a7/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1092.394199] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6fa5bb2e-c995-4f7b-b37f-f047d0b50e4e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.403470] env[60024]: DEBUG oslo_vmware.api [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Waiting for the task: (returnval){ [ 1092.403470] env[60024]: value = "task-4576305" [ 1092.403470] env[60024]: _type = "Task" [ 1092.403470] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1092.412300] env[60024]: DEBUG oslo_vmware.api [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Task: {'id': task-4576305, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1092.914590] env[60024]: DEBUG oslo_vmware.exceptions [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Fault InvalidArgument not matched. 
{{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1092.914870] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1092.915430] env[60024]: ERROR nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1092.915430] env[60024]: Faults: ['InvalidArgument'] [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Traceback (most recent call last): [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] yield resources [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] self.driver.spawn(context, instance, image_meta, [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] self._fetch_image_if_missing(context, vi) [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] image_cache(vi, tmp_image_ds_loc) [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] vm_util.copy_virtual_disk( [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] session._wait_for_task(vmdk_copy_task) [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] return self.wait_for_task(task_ref) [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] return evt.wait() [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] result = hub.switch() [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] return self.greenlet.switch() [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] self.f(*self.args, **self.kw) [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] raise exceptions.translate_fault(task_info.error) [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Faults: ['InvalidArgument'] [ 1092.915430] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] [ 1092.916685] env[60024]: INFO nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Terminating instance [ 1092.917322] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1092.917542] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1092.918160] env[60024]: DEBUG nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] 
[instance: 036d6de2-f69b-4714-b89e-9c4307253675] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1092.918351] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1092.918600] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-69f061fd-b160-4ca8-970f-3a0247c4ae7d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.920983] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-601d3467-79ab-45f3-a217-96a05919008c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.928536] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1092.928744] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-45cc5c49-7c56-4d01-b91d-2689cd70724c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.930944] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1092.931132] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1092.932067] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-509a4496-11f6-4fe8-8747-b1834d84e46e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1092.938033] env[60024]: DEBUG oslo_vmware.api [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Waiting for the task: (returnval){ [ 1092.938033] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]5217c957-5b97-b573-a7b5-ce66e52d52df" [ 1092.938033] env[60024]: _type = "Task" [ 1092.938033] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1092.950435] env[60024]: DEBUG oslo_vmware.api [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]5217c957-5b97-b573-a7b5-ce66e52d52df, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1093.004074] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1093.004379] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1093.004586] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Deleting the datastore file [datastore2] 036d6de2-f69b-4714-b89e-9c4307253675 {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1093.004967] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7d65a828-b734-40bb-b3d3-9e604c065680 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.012575] env[60024]: DEBUG oslo_vmware.api [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Waiting for the task: (returnval){ [ 1093.012575] env[60024]: value = "task-4576307" [ 1093.012575] env[60024]: _type = "Task" [ 1093.012575] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1093.021300] env[60024]: DEBUG oslo_vmware.api [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Task: {'id': task-4576307, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1093.449017] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1093.449300] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Creating directory with path [datastore2] vmware_temp/20b33431-277a-4f8d-80cb-302aabe36840/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1093.449536] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b9b23231-dc70-4b1c-8245-3a35d3a72987 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.462511] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Created directory with path [datastore2] vmware_temp/20b33431-277a-4f8d-80cb-302aabe36840/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1093.462714] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Fetch image to [datastore2] vmware_temp/20b33431-277a-4f8d-80cb-302aabe36840/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1093.462885] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/20b33431-277a-4f8d-80cb-302aabe36840/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1093.463662] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82c1877c-d4bc-49bc-9ada-11b9e383f07e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.470792] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e39cefeb-27ee-465e-a412-6c4c35954351 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.480361] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a1ba68f-625c-4253-99ca-1413f1b10e5a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.512117] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d83e7229-3349-41f6-b466-f9cb3d3048ea {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.524398] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-06b80d20-f081-45cc-aef4-dcd2aeccad60 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.526143] env[60024]: DEBUG oslo_vmware.api [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Task: {'id': task-4576307, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081542} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1093.526376] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1093.526561] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1093.526727] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1093.526897] env[60024]: INFO nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Took 0.61 seconds to destroy the instance on the hypervisor. 
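
The DeleteDatastoreFile_Task invocation and the wait_for_task polling above follow oslo.vmware's usual create-task-then-wait pattern. A minimal sketch of that pattern, assuming a reachable vCenter; the endpoint, credentials, datastore path and the omitted datacenter reference below are placeholders, not values taken from this log:

    from oslo_vmware import api

    # Placeholder endpoint and credentials (assumption, not from this log).
    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    # Ask the FileManager to delete a datastore path; this returns a task
    # reference, like the DeleteDatastoreFile_Task entry above.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore2] some-instance-dir',
        datacenter=None)  # the real call passes a datacenter ref

    # wait_for_task() drives the same _poll_task loop seen in the log and
    # raises if the task reports an error instead of completing.
    session.wait_for_task(task)

Nova's ds_util.file_delete wraps essentially this sequence, which is why the log shows the task id being polled until it reports "completed successfully".
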
[ 1093.529613] env[60024]: DEBUG nova.compute.claims [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1093.529800] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1093.530106] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1093.557836] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1093.616086] env[60024]: DEBUG oslo_vmware.rw_handles [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/20b33431-277a-4f8d-80cb-302aabe36840/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1093.674426] env[60024]: DEBUG oslo_vmware.rw_handles [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Completed reading data from the image iterator. {{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1093.674675] env[60024]: DEBUG oslo_vmware.rw_handles [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/20b33431-277a-4f8d-80cb-302aabe36840/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1093.683344] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-072c2e65-9c14-4d7f-8b23-359c9b901bf7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.691435] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e2f2605-b430-4c9b-b370-32c7fb74b0aa {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.721589] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63feecae-1d66-4018-a9bb-964670173434 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.729097] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78d78a3c-182e-4469-a86d-f7c1a1064243 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1093.742398] env[60024]: DEBUG nova.compute.provider_tree [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1093.751050] env[60024]: DEBUG nova.scheduler.client.report [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1093.765911] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.236s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1093.766434] env[60024]: ERROR nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1093.766434] env[60024]: Faults: ['InvalidArgument'] [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Traceback (most recent call last): [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1093.766434] env[60024]: ERROR 
nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] self.driver.spawn(context, instance, image_meta, [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] self._fetch_image_if_missing(context, vi) [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] image_cache(vi, tmp_image_ds_loc) [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] vm_util.copy_virtual_disk( [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] session._wait_for_task(vmdk_copy_task) [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] return self.wait_for_task(task_ref) [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] return evt.wait() [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] result = hub.switch() [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] return self.greenlet.switch() [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] self.f(*self.args, **self.kw) [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", 
line 448, in _poll_task [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] raise exceptions.translate_fault(task_info.error) [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Faults: ['InvalidArgument'] [ 1093.766434] env[60024]: ERROR nova.compute.manager [instance: 036d6de2-f69b-4714-b89e-9c4307253675] [ 1093.767319] env[60024]: DEBUG nova.compute.utils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] VimFaultException {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1093.768562] env[60024]: DEBUG nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Build of instance 036d6de2-f69b-4714-b89e-9c4307253675 was re-scheduled: A specified parameter was not correct: fileType [ 1093.768562] env[60024]: Faults: ['InvalidArgument'] {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1093.768922] env[60024]: DEBUG nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1093.769106] env[60024]: DEBUG nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1093.769276] env[60024]: DEBUG nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1093.769447] env[60024]: DEBUG nova.network.neutron [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1094.038128] env[60024]: DEBUG nova.network.neutron [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1094.049016] env[60024]: INFO nova.compute.manager [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Took 0.28 seconds to deallocate network for instance. [ 1094.138766] env[60024]: INFO nova.scheduler.client.report [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Deleted allocations for instance 036d6de2-f69b-4714-b89e-9c4307253675 [ 1094.155604] env[60024]: DEBUG oslo_concurrency.lockutils [None req-f8a499f7-7ae1-4eaf-9aab-36371b84c6c5 tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Lock "036d6de2-f69b-4714-b89e-9c4307253675" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 511.104s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1094.156782] env[60024]: DEBUG oslo_concurrency.lockutils [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Lock "036d6de2-f69b-4714-b89e-9c4307253675" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 313.482s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1094.157038] env[60024]: DEBUG oslo_concurrency.lockutils [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Acquiring lock "036d6de2-f69b-4714-b89e-9c4307253675-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1094.157249] env[60024]: DEBUG oslo_concurrency.lockutils [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Lock "036d6de2-f69b-4714-b89e-9c4307253675-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1094.157413] env[60024]: DEBUG oslo_concurrency.lockutils [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Lock "036d6de2-f69b-4714-b89e-9c4307253675-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1094.160187] env[60024]: INFO nova.compute.manager [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Terminating instance [ 1094.161973] env[60024]: DEBUG nova.compute.manager [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1094.162188] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1094.162694] env[60024]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-408a98e1-ed52-4c81-8b3d-8764f8d238c6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.172734] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf42b703-f0b5-4867-af49-6b3de4477bdf {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.186438] env[60024]: DEBUG nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Starting instance... {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1094.213740] env[60024]: WARNING nova.virt.vmwareapi.vmops [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 036d6de2-f69b-4714-b89e-9c4307253675 could not be found. 
[ 1094.213969] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1094.214195] env[60024]: INFO nova.compute.manager [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1094.214496] env[60024]: DEBUG oslo.service.loopingcall [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1094.214721] env[60024]: DEBUG nova.compute.manager [-] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1094.214816] env[60024]: DEBUG nova.network.neutron [-] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1094.245122] env[60024]: DEBUG oslo_concurrency.lockutils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1094.245395] env[60024]: DEBUG oslo_concurrency.lockutils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1094.247118] env[60024]: INFO nova.compute.claims [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1094.270448] env[60024]: DEBUG nova.network.neutron [-] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1094.279247] env[60024]: INFO nova.compute.manager [-] [instance: 036d6de2-f69b-4714-b89e-9c4307253675] Took 0.06 seconds to deallocate network for instance. 
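
The "Waiting for function ... _deallocate_network_with_retries to return" entry above comes from an oslo.service looping call. A minimal sketch of that mechanism, using a plain FixedIntervalLoopingCall and a dummy function standing in for Nova's retrying network teardown:

    from oslo_service import loopingcall

    attempts = {'count': 0}

    def _deallocate():
        # Pretend the operation needs three tries before it succeeds.
        attempts['count'] += 1
        if attempts['count'] >= 3:
            raise loopingcall.LoopingCallDone(retvalue=True)

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate)
    result = timer.start(interval=0.1).wait()  # blocks until LoopingCallDone
    print(result)  # True

Raising LoopingCallDone is how the wrapped function signals completion; until then the loop keeps invoking it on the configured interval, which is what produces the "Waiting for function ... to return" debug line.
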
[ 1094.351834] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c870ca61-f689-458d-873b-6c32cb059851 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.360281] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04a4201b-d370-42cb-a850-733b5ff3edd6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.394601] env[60024]: DEBUG oslo_concurrency.lockutils [None req-46b464b6-d994-4738-a3f3-32f7541088fe tempest-VolumesAssistedSnapshotsTest-2138134395 tempest-VolumesAssistedSnapshotsTest-2138134395-project-member] Lock "036d6de2-f69b-4714-b89e-9c4307253675" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.238s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1094.396223] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d54a62c9-9b93-4d6c-81be-0d185479bae2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.406065] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2273b4f9-0e87-4051-b6fc-7a5001e155e9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.422661] env[60024]: DEBUG nova.compute.provider_tree [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1094.431561] env[60024]: DEBUG nova.scheduler.client.report [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1094.446393] env[60024]: DEBUG oslo_concurrency.lockutils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.201s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1094.446793] env[60024]: DEBUG nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Start building networks asynchronously for instance. 
{{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1094.486179] env[60024]: DEBUG nova.compute.utils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1094.487471] env[60024]: DEBUG nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1094.487654] env[60024]: DEBUG nova.network.neutron [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1094.496538] env[60024]: DEBUG nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1094.531356] env[60024]: INFO nova.virt.block_device [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Booting with volume a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129 at /dev/sda [ 1094.554696] env[60024]: DEBUG nova.policy [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '46b1172c3120415cb62749437bbff1ce', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '50f58441c5ba4f09a505f350adf21708', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 1094.573969] env[60024]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0d55cbbe-eddb-4cc2-8b9d-bbef7c610f14 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.582861] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e13ba8b-d5f3-48c1-abb6-f138ba8a976f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.613700] env[60024]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2edae956-ad3b-4faa-9b28-fa3a1d7a7198 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.621914] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f36a2b68-4e55-41a7-898f-b85d7e101fff {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.651329] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1db3881-8134-4876-b6f6-1ad98af8bfcb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.658399] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61f9039f-741a-4fb0-b859-4bbddb55870b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.672323] env[60024]: DEBUG nova.virt.block_device [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Updating existing volume attachment record: 22944502-06d8-4caa-9d68-ae0dd4b27001 {{(pid=60024) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 1094.863350] env[60024]: DEBUG nova.network.neutron [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Successfully created port: fa6a7ab3-e871-4f58-a10a-87b794889876 {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1094.896513] env[60024]: DEBUG nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Start spawning the instance on the hypervisor. {{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1094.897064] env[60024]: DEBUG nova.virt.hardware [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1094.897278] env[60024]: DEBUG nova.virt.hardware [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1094.897433] env[60024]: DEBUG nova.virt.hardware [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1094.897609] env[60024]: DEBUG nova.virt.hardware [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Flavor pref 0:0:0 {{(pid=60024) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1094.897749] env[60024]: DEBUG nova.virt.hardware [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1094.897891] env[60024]: DEBUG nova.virt.hardware [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1094.898108] env[60024]: DEBUG nova.virt.hardware [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1094.898268] env[60024]: DEBUG nova.virt.hardware [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1094.898434] env[60024]: DEBUG nova.virt.hardware [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1094.898596] env[60024]: DEBUG nova.virt.hardware [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1094.901860] env[60024]: DEBUG nova.virt.hardware [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1094.901860] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67e6d828-588a-4a8f-afc1-cab3bdfe0300 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1094.911188] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1be2d7ed-9bc1-44f4-9611-3bf20436d7b4 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.513675] env[60024]: DEBUG nova.network.neutron [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Successfully updated port: fa6a7ab3-e871-4f58-a10a-87b794889876 {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1095.522920] env[60024]: DEBUG oslo_concurrency.lockutils [None 
req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Acquiring lock "refresh_cache-e259637a-0fc8-4368-8a7a-c15a134ed17d" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1095.523109] env[60024]: DEBUG oslo_concurrency.lockutils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Acquired lock "refresh_cache-e259637a-0fc8-4368-8a7a-c15a134ed17d" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1095.523268] env[60024]: DEBUG nova.network.neutron [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1095.572591] env[60024]: DEBUG nova.network.neutron [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Instance cache missing network info. {{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1095.735936] env[60024]: DEBUG nova.network.neutron [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Updating instance_info_cache with network_info: [{"id": "fa6a7ab3-e871-4f58-a10a-87b794889876", "address": "fa:16:3e:a9:db:4a", "network": {"id": "32f4c1f3-c181-47fe-8380-10964564ec21", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1648434312-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "50f58441c5ba4f09a505f350adf21708", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "471f65a5-21ea-45e3-a722-4e204ed65673", "external-id": "nsx-vlan-transportzone-139", "segmentation_id": 139, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfa6a7ab3-e8", "ovs_interfaceid": "fa6a7ab3-e871-4f58-a10a-87b794889876", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1095.750373] env[60024]: DEBUG oslo_concurrency.lockutils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Releasing lock "refresh_cache-e259637a-0fc8-4368-8a7a-c15a134ed17d" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1095.750675] env[60024]: DEBUG nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 
tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Instance network_info: |[{"id": "fa6a7ab3-e871-4f58-a10a-87b794889876", "address": "fa:16:3e:a9:db:4a", "network": {"id": "32f4c1f3-c181-47fe-8380-10964564ec21", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1648434312-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "50f58441c5ba4f09a505f350adf21708", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "471f65a5-21ea-45e3-a722-4e204ed65673", "external-id": "nsx-vlan-transportzone-139", "segmentation_id": 139, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfa6a7ab3-e8", "ovs_interfaceid": "fa6a7ab3-e871-4f58-a10a-87b794889876", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1095.751054] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a9:db:4a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '471f65a5-21ea-45e3-a722-4e204ed65673', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'fa6a7ab3-e871-4f58-a10a-87b794889876', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1095.758841] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Creating folder: Project (50f58441c5ba4f09a505f350adf21708). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1095.759381] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-21ef529f-1ba8-4a3a-b574-303f08ea89ac {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.774191] env[60024]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 1095.774413] env[60024]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=60024) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 1095.774938] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Folder already exists: Project (50f58441c5ba4f09a505f350adf21708). Parent ref: group-v894073. 
{{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 1095.775152] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Creating folder: Instances. Parent ref: group-v894122. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1095.775409] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-abc7c062-5f1d-415b-8e99-f0892d24a002 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.786681] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Created folder: Instances in parent group-v894122. [ 1095.786944] env[60024]: DEBUG oslo.service.loopingcall [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1095.787153] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1095.787359] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bab27ef6-0cb8-4426-a3ac-c975d45e9770 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1095.807204] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1095.807204] env[60024]: value = "task-4576310" [ 1095.807204] env[60024]: _type = "Task" [ 1095.807204] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1095.815106] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576310, 'name': CreateVM_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1096.078159] env[60024]: DEBUG nova.compute.manager [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Received event network-vif-plugged-fa6a7ab3-e871-4f58-a10a-87b794889876 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1096.078391] env[60024]: DEBUG oslo_concurrency.lockutils [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] Acquiring lock "e259637a-0fc8-4368-8a7a-c15a134ed17d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1096.078663] env[60024]: DEBUG oslo_concurrency.lockutils [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] Lock "e259637a-0fc8-4368-8a7a-c15a134ed17d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1096.078875] env[60024]: DEBUG oslo_concurrency.lockutils [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] Lock "e259637a-0fc8-4368-8a7a-c15a134ed17d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1096.079102] env[60024]: DEBUG nova.compute.manager [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] No waiting events found dispatching network-vif-plugged-fa6a7ab3-e871-4f58-a10a-87b794889876 {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1096.079297] env[60024]: WARNING nova.compute.manager [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Received unexpected event network-vif-plugged-fa6a7ab3-e871-4f58-a10a-87b794889876 for instance with vm_state building and task_state spawning. [ 1096.079484] env[60024]: DEBUG nova.compute.manager [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Received event network-changed-fa6a7ab3-e871-4f58-a10a-87b794889876 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1096.079664] env[60024]: DEBUG nova.compute.manager [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Refreshing instance network info cache due to event network-changed-fa6a7ab3-e871-4f58-a10a-87b794889876. 
{{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1096.079854] env[60024]: DEBUG oslo_concurrency.lockutils [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] Acquiring lock "refresh_cache-e259637a-0fc8-4368-8a7a-c15a134ed17d" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1096.080193] env[60024]: DEBUG oslo_concurrency.lockutils [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] Acquired lock "refresh_cache-e259637a-0fc8-4368-8a7a-c15a134ed17d" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1096.080193] env[60024]: DEBUG nova.network.neutron [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Refreshing network info cache for port fa6a7ab3-e871-4f58-a10a-87b794889876 {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1096.311621] env[60024]: DEBUG nova.network.neutron [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Updated VIF entry in instance network info cache for port fa6a7ab3-e871-4f58-a10a-87b794889876. {{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1096.311982] env[60024]: DEBUG nova.network.neutron [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Updating instance_info_cache with network_info: [{"id": "fa6a7ab3-e871-4f58-a10a-87b794889876", "address": "fa:16:3e:a9:db:4a", "network": {"id": "32f4c1f3-c181-47fe-8380-10964564ec21", "bridge": "br-int", "label": "tempest-ServersTestBootFromVolume-1648434312-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "50f58441c5ba4f09a505f350adf21708", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "471f65a5-21ea-45e3-a722-4e204ed65673", "external-id": "nsx-vlan-transportzone-139", "segmentation_id": 139, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapfa6a7ab3-e8", "ovs_interfaceid": "fa6a7ab3-e871-4f58-a10a-87b794889876", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1096.319570] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576310, 'name': CreateVM_Task, 'duration_secs': 0.315808} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1096.319740] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1096.320323] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-894125', 'volume_id': 'a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129', 'name': 'volume-a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'e259637a-0fc8-4368-8a7a-c15a134ed17d', 'attached_at': '', 'detached_at': '', 'volume_id': 'a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129', 'serial': 'a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129'}, 'mount_device': '/dev/sda', 'guest_format': None, 'disk_bus': None, 'boot_index': 0, 'device_type': None, 'delete_on_termination': True, 'attachment_id': '22944502-06d8-4caa-9d68-ae0dd4b27001', 'volume_type': None}], 'swap': None} {{(pid=60024) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 1096.320545] env[60024]: DEBUG nova.virt.vmwareapi.volumeops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Root volume attach. 
Driver type: vmdk {{(pid=60024) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 1096.321861] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d965279-6e10-427e-802c-a02588d87954 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.324535] env[60024]: DEBUG oslo_concurrency.lockutils [req-e86525dc-4f38-4582-9484-aaeeb5d2eabd req-690fe30f-1f49-49a5-a619-91d98d22bc55 service nova] Releasing lock "refresh_cache-e259637a-0fc8-4368-8a7a-c15a134ed17d" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1096.330927] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46ec1c37-89d5-4e25-b49f-dcb6f8d0dfb9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.337262] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f67da8d9-47b2-4abf-b4f7-1572b63a5a63 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.343613] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-79de263f-4957-4127-98e2-2c828d29120d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.350996] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Waiting for the task: (returnval){ [ 1096.350996] env[60024]: value = "task-4576311" [ 1096.350996] env[60024]: _type = "Task" [ 1096.350996] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1096.358606] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576311, 'name': RelocateVM_Task} progress is 5%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1096.861390] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576311, 'name': RelocateVM_Task, 'duration_secs': 0.368323} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1096.861802] env[60024]: DEBUG nova.virt.vmwareapi.volumeops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Volume attach. 
Driver type: vmdk {{(pid=60024) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 1096.861878] env[60024]: DEBUG nova.virt.vmwareapi.volumeops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-894125', 'volume_id': 'a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129', 'name': 'volume-a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'e259637a-0fc8-4368-8a7a-c15a134ed17d', 'attached_at': '', 'detached_at': '', 'volume_id': 'a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129', 'serial': 'a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129'} {{(pid=60024) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 1096.862704] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92d6c0f4-2d08-42a5-bddb-dd87b21473fd {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.881265] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bcd84fe-d75e-4c41-8a0d-e9388fdf85b3 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.904015] env[60024]: DEBUG nova.virt.vmwareapi.volumeops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Reconfiguring VM instance instance-0000001c to attach disk [datastore2] volume-a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129/volume-a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129.vmdk or device None with type thin {{(pid=60024) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 1096.904282] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-0b2a29a3-332a-4ab4-a6e9-eaafeb07dc0a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1096.925893] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Waiting for the task: (returnval){ [ 1096.925893] env[60024]: value = "task-4576312" [ 1096.925893] env[60024]: _type = "Task" [ 1096.925893] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1096.934192] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576312, 'name': ReconfigVM_Task} progress is 5%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1097.436645] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576312, 'name': ReconfigVM_Task, 'duration_secs': 0.288887} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1097.437064] env[60024]: DEBUG nova.virt.vmwareapi.volumeops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Reconfigured VM instance instance-0000001c to attach disk [datastore2] volume-a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129/volume-a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129.vmdk or device None with type thin {{(pid=60024) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 1097.441585] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-87dd1d98-21a8-4ce8-9eda-f2c9e5a808c2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1097.459862] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Waiting for the task: (returnval){ [ 1097.459862] env[60024]: value = "task-4576314" [ 1097.459862] env[60024]: _type = "Task" [ 1097.459862] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1097.469611] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576314, 'name': ReconfigVM_Task} progress is 5%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1097.970999] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576314, 'name': ReconfigVM_Task} progress is 99%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1098.108464] env[60024]: DEBUG nova.compute.manager [req-004f7b4f-cb28-41de-81b4-881429755208 req-e86531b6-dbf3-4685-8bfd-6c0b0c8d310d service nova] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Received event network-vif-deleted-fa6a7ab3-e871-4f58-a10a-87b794889876 {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1098.473100] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576314, 'name': ReconfigVM_Task} progress is 99%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1098.973218] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576314, 'name': ReconfigVM_Task, 'duration_secs': 1.122515} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1098.973579] env[60024]: DEBUG nova.virt.vmwareapi.volumeops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-894125', 'volume_id': 'a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129', 'name': 'volume-a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'e259637a-0fc8-4368-8a7a-c15a134ed17d', 'attached_at': '', 'detached_at': '', 'volume_id': 'a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129', 'serial': 'a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129'} {{(pid=60024) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 1098.974084] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-96f4e62b-c0aa-4698-b034-da71ffe19b88 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1098.981810] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Waiting for the task: (returnval){ [ 1098.981810] env[60024]: value = "task-4576315" [ 1098.981810] env[60024]: _type = "Task" [ 1098.981810] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1098.999760] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576315, 'name': Rename_Task} progress is 6%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1099.493982] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576315, 'name': Rename_Task, 'duration_secs': 0.122518} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1099.494273] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Powering on the VM {{(pid=60024) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 1099.494564] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-2c7cc68e-e80a-43df-be75-e02cf000f626 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1099.501953] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Waiting for the task: (returnval){ [ 1099.501953] env[60024]: value = "task-4576316" [ 1099.501953] env[60024]: _type = "Task" [ 1099.501953] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1099.510110] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576316, 'name': PowerOnVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1100.014906] env[60024]: ERROR nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Instance failed to spawn: oslo_vmware.exceptions.FileNotFoundException: File /vmfs/volumes/4e2c0edf-dfee12b7-0000-000000000000/volume-a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129/volume-a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129.vmdk was not found [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Traceback (most recent call last): [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] yield resources [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] self.driver.spawn(context, instance, image_meta, [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 836, in spawn [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] vm_util.power_on_instance(self._session, instance, vm_ref=vm_ref) [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1447, in power_on_instance [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] session._wait_for_task(poweron_task) [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] return self.wait_for_task(task_ref) [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] return evt.wait() [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 
125, in wait [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] result = hub.switch() [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] return self.greenlet.switch() [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] self.f(*self.args, **self.kw) [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] raise exceptions.translate_fault(task_info.error) [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] oslo_vmware.exceptions.FileNotFoundException: File /vmfs/volumes/4e2c0edf-dfee12b7-0000-000000000000/volume-a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129/volume-a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129.vmdk was not found [ 1100.014906] env[60024]: ERROR nova.compute.manager [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] [ 1100.016040] env[60024]: INFO nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Terminating instance [ 1100.017892] env[60024]: DEBUG nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1100.018115] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Powering off the VM {{(pid=60024) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 1100.018338] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-e1ce095c-afbe-4ceb-a0ae-5a1839da3f50 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.025299] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Waiting for the task: (returnval){ [ 1100.025299] env[60024]: value = "task-4576317" [ 1100.025299] env[60024]: _type = "Task" [ 1100.025299] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1100.034295] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] VM already powered off {{(pid=60024) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1509}} [ 1100.034526] env[60024]: DEBUG nova.virt.vmwareapi.volumeops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Volume detach. Driver type: vmdk {{(pid=60024) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 1100.034717] env[60024]: DEBUG nova.virt.vmwareapi.volumeops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-894125', 'volume_id': 'a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129', 'name': 'volume-a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'e259637a-0fc8-4368-8a7a-c15a134ed17d', 'attached_at': '', 'detached_at': '', 'volume_id': 'a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129', 'serial': 'a68c7f7d-bdca-4eb3-98b9-e7d6e3a78129'} {{(pid=60024) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 1100.035464] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1093dc05-20b6-4d38-aa21-f143b8242429 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.053912] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd769f80-0f91-4322-934b-66f3a8579c5e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.061684] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3648bc1e-27ca-43d0-a4b6-f300538ea839 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.079163] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b04eed4-5a1a-4ed7-b8c6-e2ea5ffeea09 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.086851] env[60024]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. 
[ 1100.087027] env[60024]: DEBUG oslo_vmware.api [-] Fault list: [ManagedObjectNotFound] {{(pid=60024) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 1100.087342] env[60024]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ac9f10a2-3e73-4a20-a7df-437c49635450 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.093281] env[60024]: ERROR nova.virt.vmwareapi.driver [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Failed to detach None. Exception: The object 'vim.VirtualMachine:vm-894125' has already been deleted or has not been completely created [ 1100.093281] env[60024]: Cause: Server raised fault: 'The object 'vim.VirtualMachine:vm-894125' has already been deleted or has not been completely created' [ 1100.093281] env[60024]: Faults: [ManagedObjectNotFound] [ 1100.093281] env[60024]: Details: {'obj': 'vm-894125'}: oslo_vmware.exceptions.ManagedObjectNotFoundException: The object 'vim.VirtualMachine:vm-894125' has already been deleted or has not been completely created [ 1100.093745] env[60024]: WARNING nova.virt.vmwareapi.driver [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Instance does not exists. Proceeding to delete instance properties on datastore: oslo_vmware.exceptions.ManagedObjectNotFoundException: The object 'vim.VirtualMachine:vm-894125' has already been deleted or has not been completely created [ 1100.093840] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1100.094565] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2ae2ae7-8204-447d-9252-62dd8b532bdd {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.101298] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1100.101507] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-17f2bf91-916f-4ff6-9b7e-a3ac003c8542 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.163628] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1100.163846] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 
tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1100.164044] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Deleting the datastore file [datastore2] e259637a-0fc8-4368-8a7a-c15a134ed17d {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1100.164333] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-01845745-80f1-458b-ab3a-50fc38831161 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.172020] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Waiting for the task: (returnval){ [ 1100.172020] env[60024]: value = "task-4576319" [ 1100.172020] env[60024]: _type = "Task" [ 1100.172020] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1100.180465] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576319, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1100.682791] env[60024]: DEBUG oslo_vmware.api [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Task: {'id': task-4576319, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072204} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1100.683045] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1100.683231] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1100.683399] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1100.683575] env[60024]: INFO nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Took 0.67 seconds to destroy the instance on the hypervisor. 
[ 1100.697615] env[60024]: WARNING nova.volume.cinder [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Attachment 22944502-06d8-4caa-9d68-ae0dd4b27001 does not exist. Ignoring.: cinderclient.exceptions.NotFound: Volume attachment could not be found with filter: attachment_id = 22944502-06d8-4caa-9d68-ae0dd4b27001. (HTTP 404) (Request-ID: req-96abe75a-60de-4d65-b928-69a4804b807a) [ 1100.697908] env[60024]: INFO nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Took 0.01 seconds to detach 1 volumes for instance. [ 1100.700013] env[60024]: DEBUG nova.compute.claims [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1100.700203] env[60024]: DEBUG oslo_concurrency.lockutils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1100.700427] env[60024]: DEBUG oslo_concurrency.lockutils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1100.728280] env[60024]: DEBUG oslo_concurrency.lockutils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1100.728965] env[60024]: DEBUG nova.compute.utils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Instance e259637a-0fc8-4368-8a7a-c15a134ed17d could not be found. {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1100.730631] env[60024]: DEBUG nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Instance disappeared during build. 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1100.730802] env[60024]: DEBUG nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1100.730963] env[60024]: DEBUG nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1100.731143] env[60024]: DEBUG nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1100.731303] env[60024]: DEBUG nova.network.neutron [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1100.754139] env[60024]: DEBUG nova.network.neutron [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1100.762301] env[60024]: INFO nova.compute.manager [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Took 0.03 seconds to deallocate network for instance. [ 1100.806889] env[60024]: DEBUG oslo_concurrency.lockutils [None req-0ddcd069-9e43-4f7b-afad-2168625b0fa4 tempest-ServersTestBootFromVolume-728161428 tempest-ServersTestBootFromVolume-728161428-project-member] Lock "e259637a-0fc8-4368-8a7a-c15a134ed17d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.847s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1100.816350] env[60024]: DEBUG nova.compute.manager [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Starting instance... 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1100.861384] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1100.861637] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1100.864025] env[60024]: INFO nova.compute.claims [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1100.954094] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c5ca9ab-c73e-49a2-b53e-49e2fce3df68 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.962475] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ee70050-b20e-4793-8de5-a0f8c559ce6a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1100.992825] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-773e50a4-f346-4796-94fc-18e840e14240 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1101.001051] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-463967fa-0c0e-4bfa-9335-b02c4239fca7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1101.015192] env[60024]: DEBUG nova.compute.provider_tree [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1101.026712] env[60024]: DEBUG nova.scheduler.client.report [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1101.040569] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 
tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1101.041081] env[60024]: DEBUG nova.compute.manager [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Start building networks asynchronously for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1101.079420] env[60024]: DEBUG nova.compute.utils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Using /dev/sd instead of None {{(pid=60024) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1101.080683] env[60024]: DEBUG nova.compute.manager [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Allocating IP information in the background. {{(pid=60024) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1101.080852] env[60024]: DEBUG nova.network.neutron [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] allocate_for_instance() {{(pid=60024) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1101.091229] env[60024]: DEBUG nova.compute.manager [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Start building block device mappings for instance. {{(pid=60024) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1101.140046] env[60024]: DEBUG nova.policy [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '023b6b895086429595d77b626450d972', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ebb28978bff4e9a93497373d9a16e18', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=60024) authorize /opt/stack/nova/nova/policy.py:203}} [ 1101.155549] env[60024]: DEBUG nova.compute.manager [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Start spawning the instance on the hypervisor. 
{{(pid=60024) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1101.177905] env[60024]: DEBUG nova.virt.hardware [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-11-20T09:12:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-11-20T09:12:31Z,direct_url=,disk_format='vmdk',id=ce78d8ba-df84-4ce9-9b5e-632fda86b4cc,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='a8236fdd83234f75a229055fe16f088d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-11-20T09:12:32Z,virtual_size=,visibility=), allow threads: False {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1101.178173] env[60024]: DEBUG nova.virt.hardware [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Flavor limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1101.178332] env[60024]: DEBUG nova.virt.hardware [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Image limits 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1101.178512] env[60024]: DEBUG nova.virt.hardware [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Flavor pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1101.178659] env[60024]: DEBUG nova.virt.hardware [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Image pref 0:0:0 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1101.178806] env[60024]: DEBUG nova.virt.hardware [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=60024) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1101.179025] env[60024]: DEBUG nova.virt.hardware [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1101.179189] env[60024]: DEBUG nova.virt.hardware [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1101.179357] env[60024]: DEBUG 
nova.virt.hardware [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Got 1 possible topologies {{(pid=60024) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1101.179519] env[60024]: DEBUG nova.virt.hardware [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1101.179967] env[60024]: DEBUG nova.virt.hardware [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=60024) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1101.180575] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2507e8eb-de2e-4077-b539-2ddf7b6f3a55 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1101.189757] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bdc770e-90bd-4d1e-a097-57e01cb1914b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1101.424641] env[60024]: DEBUG nova.network.neutron [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Successfully created port: cab434dd-5abd-453d-b107-7c888181bdbe {{(pid=60024) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1101.942186] env[60024]: DEBUG nova.compute.manager [req-6af15580-6e5b-45b8-8352-008c390febaf req-0c79f3c0-a469-4c68-b2b0-162f9fcfb98d service nova] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Received event network-vif-plugged-cab434dd-5abd-453d-b107-7c888181bdbe {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1101.942547] env[60024]: DEBUG oslo_concurrency.lockutils [req-6af15580-6e5b-45b8-8352-008c390febaf req-0c79f3c0-a469-4c68-b2b0-162f9fcfb98d service nova] Acquiring lock "8b64034a-4d67-4605-adb8-a007dd735230-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1101.942879] env[60024]: DEBUG oslo_concurrency.lockutils [req-6af15580-6e5b-45b8-8352-008c390febaf req-0c79f3c0-a469-4c68-b2b0-162f9fcfb98d service nova] Lock "8b64034a-4d67-4605-adb8-a007dd735230-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1101.943225] env[60024]: DEBUG oslo_concurrency.lockutils [req-6af15580-6e5b-45b8-8352-008c390febaf req-0c79f3c0-a469-4c68-b2b0-162f9fcfb98d service nova] Lock "8b64034a-4d67-4605-adb8-a007dd735230-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1101.943493] env[60024]: DEBUG nova.compute.manager 
[req-6af15580-6e5b-45b8-8352-008c390febaf req-0c79f3c0-a469-4c68-b2b0-162f9fcfb98d service nova] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] No waiting events found dispatching network-vif-plugged-cab434dd-5abd-453d-b107-7c888181bdbe {{(pid=60024) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1101.943730] env[60024]: WARNING nova.compute.manager [req-6af15580-6e5b-45b8-8352-008c390febaf req-0c79f3c0-a469-4c68-b2b0-162f9fcfb98d service nova] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Received unexpected event network-vif-plugged-cab434dd-5abd-453d-b107-7c888181bdbe for instance with vm_state building and task_state spawning. [ 1102.093854] env[60024]: DEBUG nova.network.neutron [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Successfully updated port: cab434dd-5abd-453d-b107-7c888181bdbe {{(pid=60024) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1102.107958] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Acquiring lock "refresh_cache-8b64034a-4d67-4605-adb8-a007dd735230" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1102.108139] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Acquired lock "refresh_cache-8b64034a-4d67-4605-adb8-a007dd735230" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1102.108296] env[60024]: DEBUG nova.network.neutron [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Building network info cache for instance {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1102.159369] env[60024]: DEBUG nova.network.neutron [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Instance cache missing network info. 
{{(pid=60024) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1102.334295] env[60024]: DEBUG nova.network.neutron [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Updating instance_info_cache with network_info: [{"id": "cab434dd-5abd-453d-b107-7c888181bdbe", "address": "fa:16:3e:a4:bc:6b", "network": {"id": "1e68f188-2948-4f00-af0f-73be9708c229", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1663167050-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1ebb28978bff4e9a93497373d9a16e18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "abcf0d10-3f3f-45dc-923e-1c78766e2dad", "external-id": "nsx-vlan-transportzone-405", "segmentation_id": 405, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcab434dd-5a", "ovs_interfaceid": "cab434dd-5abd-453d-b107-7c888181bdbe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1102.346999] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Releasing lock "refresh_cache-8b64034a-4d67-4605-adb8-a007dd735230" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1102.347395] env[60024]: DEBUG nova.compute.manager [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Instance network_info: |[{"id": "cab434dd-5abd-453d-b107-7c888181bdbe", "address": "fa:16:3e:a4:bc:6b", "network": {"id": "1e68f188-2948-4f00-af0f-73be9708c229", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1663167050-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1ebb28978bff4e9a93497373d9a16e18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "abcf0d10-3f3f-45dc-923e-1c78766e2dad", "external-id": "nsx-vlan-transportzone-405", "segmentation_id": 405, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcab434dd-5a", "ovs_interfaceid": "cab434dd-5abd-453d-b107-7c888181bdbe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=60024) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1102.348023] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a4:bc:6b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'abcf0d10-3f3f-45dc-923e-1c78766e2dad', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'cab434dd-5abd-453d-b107-7c888181bdbe', 'vif_model': 'vmxnet3'}] {{(pid=60024) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1102.355683] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Creating folder: Project (1ebb28978bff4e9a93497373d9a16e18). Parent ref: group-v894073. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1102.356288] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-85898dd6-de25-4732-8e2e-c36b35930db5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1102.369398] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Created folder: Project (1ebb28978bff4e9a93497373d9a16e18) in parent group-v894073. [ 1102.369615] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Creating folder: Instances. Parent ref: group-v894138. {{(pid=60024) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1102.370156] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bac59abe-d56f-467d-9c16-77ccaab03442 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1102.379908] env[60024]: INFO nova.virt.vmwareapi.vm_util [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Created folder: Instances in parent group-v894138. [ 1102.380255] env[60024]: DEBUG oslo.service.loopingcall [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1102.380468] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Creating VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1102.380720] env[60024]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a978ed2b-6277-4b9c-9708-2e4ded9d99b5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1102.402022] env[60024]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1102.402022] env[60024]: value = "task-4576322" [ 1102.402022] env[60024]: _type = "Task" [ 1102.402022] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1102.410969] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576322, 'name': CreateVM_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1102.911613] env[60024]: DEBUG oslo_vmware.api [-] Task: {'id': task-4576322, 'name': CreateVM_Task, 'duration_secs': 0.31814} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1102.911790] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Created VM on the ESX host {{(pid=60024) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1102.912542] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1102.912717] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1102.913051] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1102.913299] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f4890770-835e-4d87-85ab-784218c7a042 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1102.918243] env[60024]: DEBUG oslo_vmware.api [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Waiting for the task: (returnval){ [ 1102.918243] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52689ccd-ac7d-bda8-0c42-8338e9c8d083" [ 1102.918243] env[60024]: _type = "Task" [ 1102.918243] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1102.926582] env[60024]: DEBUG oslo_vmware.api [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52689ccd-ac7d-bda8-0c42-8338e9c8d083, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1103.431066] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1103.431485] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Processing image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1103.431485] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1103.969427] env[60024]: DEBUG nova.compute.manager [req-43103cef-1a0e-4541-9e87-d78fbad2c5cb req-70167ee5-c95d-4ce0-b584-9a30d80a12a1 service nova] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Received event network-changed-cab434dd-5abd-453d-b107-7c888181bdbe {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1103.969427] env[60024]: DEBUG nova.compute.manager [req-43103cef-1a0e-4541-9e87-d78fbad2c5cb req-70167ee5-c95d-4ce0-b584-9a30d80a12a1 service nova] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Refreshing instance network info cache due to event network-changed-cab434dd-5abd-453d-b107-7c888181bdbe. {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1103.969427] env[60024]: DEBUG oslo_concurrency.lockutils [req-43103cef-1a0e-4541-9e87-d78fbad2c5cb req-70167ee5-c95d-4ce0-b584-9a30d80a12a1 service nova] Acquiring lock "refresh_cache-8b64034a-4d67-4605-adb8-a007dd735230" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1103.969427] env[60024]: DEBUG oslo_concurrency.lockutils [req-43103cef-1a0e-4541-9e87-d78fbad2c5cb req-70167ee5-c95d-4ce0-b584-9a30d80a12a1 service nova] Acquired lock "refresh_cache-8b64034a-4d67-4605-adb8-a007dd735230" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1103.969574] env[60024]: DEBUG nova.network.neutron [req-43103cef-1a0e-4541-9e87-d78fbad2c5cb req-70167ee5-c95d-4ce0-b584-9a30d80a12a1 service nova] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Refreshing network info cache for port cab434dd-5abd-453d-b107-7c888181bdbe {{(pid=60024) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1104.471016] env[60024]: DEBUG nova.network.neutron [req-43103cef-1a0e-4541-9e87-d78fbad2c5cb req-70167ee5-c95d-4ce0-b584-9a30d80a12a1 service nova] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Updated VIF entry in instance network info cache for port cab434dd-5abd-453d-b107-7c888181bdbe. 
{{(pid=60024) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1104.471413] env[60024]: DEBUG nova.network.neutron [req-43103cef-1a0e-4541-9e87-d78fbad2c5cb req-70167ee5-c95d-4ce0-b584-9a30d80a12a1 service nova] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Updating instance_info_cache with network_info: [{"id": "cab434dd-5abd-453d-b107-7c888181bdbe", "address": "fa:16:3e:a4:bc:6b", "network": {"id": "1e68f188-2948-4f00-af0f-73be9708c229", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1663167050-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1ebb28978bff4e9a93497373d9a16e18", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "abcf0d10-3f3f-45dc-923e-1c78766e2dad", "external-id": "nsx-vlan-transportzone-405", "segmentation_id": 405, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapcab434dd-5a", "ovs_interfaceid": "cab434dd-5abd-453d-b107-7c888181bdbe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1104.480736] env[60024]: DEBUG oslo_concurrency.lockutils [req-43103cef-1a0e-4541-9e87-d78fbad2c5cb req-70167ee5-c95d-4ce0-b584-9a30d80a12a1 service nova] Releasing lock "refresh_cache-8b64034a-4d67-4605-adb8-a007dd735230" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1117.342337] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1117.342811] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Cleaning up deleted instances with incomplete migration {{(pid=60024) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 1120.347076] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1122.342252] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1122.342252] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1122.342252] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60024) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1122.352210] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1122.352439] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1122.352588] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1122.352744] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60024) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1122.353891] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc41d427-792f-42d7-b377-8277fc6ba0aa {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1122.363216] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b7099b4-dd20-48a4-a90c-ae87036778eb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1122.378256] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42aa3fd8-a554-49ef-8645-8762a4b87550 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1122.385191] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1708c813-3738-48dd-acfa-05fa8e025c3f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1122.415255] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180619MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60024) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1122.415409] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1122.415624] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1122.543762] env[60024]: DEBUG 
nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1122.543862] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 8b64034a-4d67-4605-adb8-a007dd735230 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1122.544016] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1122.544175] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=100GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1122.579855] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38092fc1-1cf8-47c3-91a0-7a2c4c07c1a9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1122.587688] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf680f64-7ecf-4f35-827c-d0aee5c8d84f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1122.617576] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b62054e9-578f-43dd-b074-02ce40a2b34e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1122.625082] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3957e62f-c6a0-456d-8b63-cf1ff51cf457 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1122.638255] env[60024]: DEBUG nova.compute.provider_tree [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1122.646292] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1122.658725] env[60024]: DEBUG nova.compute.resource_tracker [None 
req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60024) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1122.658912] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1122.659166] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1123.347474] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1123.347847] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=60024) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1123.347847] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1123.347927] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Cleaning up deleted instances {{(pid=60024) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 1123.373509] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] There are 9 instances to clean {{(pid=60024) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 1123.373769] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: e259637a-0fc8-4368-8a7a-c15a134ed17d] Instance has had 0 of 5 cleanup attempts {{(pid=60024) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1123.409523] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Instance has had 0 of 5 cleanup attempts {{(pid=60024) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1123.444230] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Instance has had 0 of 5 cleanup attempts {{(pid=60024) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1123.465706] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Instance has had 0 of 5 cleanup attempts {{(pid=60024) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1123.487687] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Instance has had 0 of 5 cleanup attempts {{(pid=60024) _run_pending_deletes 
/opt/stack/nova/nova/compute/manager.py:11114}} [ 1123.509984] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Instance has had 0 of 5 cleanup attempts {{(pid=60024) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1123.529590] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Instance has had 0 of 5 cleanup attempts {{(pid=60024) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1123.549300] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Instance has had 0 of 5 cleanup attempts {{(pid=60024) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1123.569717] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Instance has had 0 of 5 cleanup attempts {{(pid=60024) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1124.585063] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1125.337232] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1125.341613] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1125.341858] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Starting heal instance info cache {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1125.342085] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Rebuilding the list of instances to heal {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1125.353067] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1125.353222] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1125.353358] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Didn't find any instances for network info cache update. 
{{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1125.353810] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1125.353984] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1140.631649] env[60024]: WARNING oslo_vmware.rw_handles [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1140.631649] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1140.631649] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1140.631649] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1140.631649] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1140.631649] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 1140.631649] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1140.631649] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1140.631649] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1140.631649] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1140.631649] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1140.631649] env[60024]: ERROR oslo_vmware.rw_handles [ 1140.632511] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to vmware_temp/20b33431-277a-4f8d-80cb-302aabe36840/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1140.633964] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1140.634221] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Copying Virtual Disk [datastore2] vmware_temp/20b33431-277a-4f8d-80cb-302aabe36840/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/20b33431-277a-4f8d-80cb-302aabe36840/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) 
copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1140.634500] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0c3df34c-4b6c-4c97-8a9a-803febdd1663 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1140.644448] env[60024]: DEBUG oslo_vmware.api [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Waiting for the task: (returnval){ [ 1140.644448] env[60024]: value = "task-4576323" [ 1140.644448] env[60024]: _type = "Task" [ 1140.644448] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1140.653053] env[60024]: DEBUG oslo_vmware.api [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Task: {'id': task-4576323, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1141.155861] env[60024]: DEBUG oslo_vmware.exceptions [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Fault InvalidArgument not matched. {{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1141.156107] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1141.156620] env[60024]: ERROR nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1141.156620] env[60024]: Faults: ['InvalidArgument'] [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Traceback (most recent call last): [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] yield resources [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] self.driver.spawn(context, instance, image_meta, [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] self._fetch_image_if_missing(context, vi) [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] image_cache(vi, tmp_image_ds_loc) [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] vm_util.copy_virtual_disk( [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] session._wait_for_task(vmdk_copy_task) [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] return self.wait_for_task(task_ref) [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] return evt.wait() [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] result = hub.switch() [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] return self.greenlet.switch() [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] self.f(*self.args, **self.kw) [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] raise exceptions.translate_fault(task_info.error) [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Faults: ['InvalidArgument'] [ 1141.156620] env[60024]: ERROR nova.compute.manager [instance: 
5888cc9f-7341-4f9c-a93c-dd5ec95f7369] [ 1141.157741] env[60024]: INFO nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Terminating instance [ 1141.158532] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1141.158740] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1141.158974] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1e7aab24-2cfd-45c8-96b4-8c57cd6e89d0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.161292] env[60024]: DEBUG nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1141.161479] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1141.162208] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bde2396-4d94-4e17-9b4e-e68e8c52be03 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.169941] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1141.170200] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cefb095e-2107-432b-a0a7-91ea11971af4 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.172501] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1141.172685] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1141.173651] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-22e582b0-2473-4a44-8df6-5ce71d76def3 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.179591] env[60024]: DEBUG oslo_vmware.api [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Waiting for the task: (returnval){ [ 1141.179591] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52061eab-da19-cf5f-9b7b-5d1410c376ef" [ 1141.179591] env[60024]: _type = "Task" [ 1141.179591] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1141.188026] env[60024]: DEBUG oslo_vmware.api [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]52061eab-da19-cf5f-9b7b-5d1410c376ef, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1141.242022] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1141.242255] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1141.242442] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Deleting the datastore file [datastore2] 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1141.242777] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e72b0b7c-1a65-49d1-9ab6-2daaf7622d79 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.250866] env[60024]: DEBUG oslo_vmware.api [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Waiting for the task: (returnval){ [ 1141.250866] env[60024]: value = "task-4576325" [ 1141.250866] env[60024]: _type = "Task" [ 1141.250866] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1141.260488] env[60024]: DEBUG oslo_vmware.api [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Task: {'id': task-4576325, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1141.691196] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1141.691580] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Creating directory with path [datastore2] vmware_temp/56aa9f20-6307-416e-87ce-37cfd288af34/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1141.691682] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-207a7aa0-0f39-4ae6-9222-692fb074bc0c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.703207] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Created directory with path [datastore2] vmware_temp/56aa9f20-6307-416e-87ce-37cfd288af34/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1141.703391] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Fetch image to [datastore2] vmware_temp/56aa9f20-6307-416e-87ce-37cfd288af34/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1141.703560] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/56aa9f20-6307-416e-87ce-37cfd288af34/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1141.704298] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21050939-8cd2-46a0-aad6-52fdfd0d369a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.711297] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3d9fbe0-738e-4188-89ea-41700d095722 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.720881] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e61f1540-21b0-4d16-80e2-03bfc79f2659 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.751863] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3278fb55-5f8f-4983-a515-26ee7bbcc1c9 {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.764147] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-262c4295-a41b-49c9-a64e-bcb7ec891c3b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.765972] env[60024]: DEBUG oslo_vmware.api [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Task: {'id': task-4576325, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079764} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1141.766228] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1141.766410] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1141.766578] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1141.766748] env[60024]: INFO nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1141.768852] env[60024]: DEBUG nova.compute.claims [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1141.769027] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1141.769244] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1141.791944] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1141.845197] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-228886c5-053f-48cb-b952-435e007e05c1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.853150] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e37d346-1521-43c9-909e-7f45064674bc {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.886576] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6decb445-ff35-4d02-b0d9-5a69c56bd547 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.895096] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87e36d52-1619-47c7-a5fe-c084dfbe830f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1141.909152] env[60024]: DEBUG nova.compute.provider_tree [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1141.917766] env[60024]: DEBUG nova.scheduler.client.report [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 
'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1141.932746] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.163s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1141.933293] env[60024]: ERROR nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1141.933293] env[60024]: Faults: ['InvalidArgument'] [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Traceback (most recent call last): [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] self.driver.spawn(context, instance, image_meta, [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] self._fetch_image_if_missing(context, vi) [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] image_cache(vi, tmp_image_ds_loc) [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] vm_util.copy_virtual_disk( [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] session._wait_for_task(vmdk_copy_task) [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] return self.wait_for_task(task_ref) [ 1141.933293] env[60024]: ERROR nova.compute.manager 
[instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] return evt.wait() [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] result = hub.switch() [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] return self.greenlet.switch() [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] self.f(*self.args, **self.kw) [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] raise exceptions.translate_fault(task_info.error) [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Faults: ['InvalidArgument'] [ 1141.933293] env[60024]: ERROR nova.compute.manager [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] [ 1141.934142] env[60024]: DEBUG nova.compute.utils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] VimFaultException {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1141.935601] env[60024]: DEBUG nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Build of instance 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 was re-scheduled: A specified parameter was not correct: fileType [ 1141.935601] env[60024]: Faults: ['InvalidArgument'] {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1141.936013] env[60024]: DEBUG nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1141.936202] env[60024]: DEBUG nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1141.936358] env[60024]: DEBUG nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1141.936520] env[60024]: DEBUG nova.network.neutron [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1142.007524] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1142.008243] env[60024]: ERROR nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Traceback (most recent call last): [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] result = getattr(controller, method)(*args, **kwargs) [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self._get(image_id) [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] resp, body = self.http_client.get(url, headers=header) [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self.request(url, 'GET', **kwargs) [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self._handle_response(resp) [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] raise exc.from_response(resp, resp.content) [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] During handling of the above exception, another exception occurred: [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Traceback (most recent call last): [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] yield resources [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self.driver.spawn(context, instance, image_meta, [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1142.008243] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self._fetch_image_if_missing(context, vi) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] image_fetch(context, vi, tmp_image_ds_loc) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 
08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] images.fetch_image( [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] metadata = IMAGE_API.get(context, image_ref) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return session.show(context, image_id, [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] _reraise_translated_image_exception(image_id) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] raise new_exc.with_traceback(exc_trace) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] result = getattr(controller, method)(*args, **kwargs) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self._get(image_id) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] resp, body = self.http_client.get(url, headers=header) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self.request(url, 'GET', 
**kwargs) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self._handle_response(resp) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] raise exc.from_response(resp, resp.content) [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1142.009210] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.009210] env[60024]: INFO nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Terminating instance [ 1142.010516] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1142.010516] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1142.011080] env[60024]: DEBUG nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1142.011269] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1142.011494] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9e980b76-a4d8-45c1-8213-82adc976568a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.014043] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d910f51-d7e0-4c00-aaef-ce7c0ed391a8 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.029794] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1142.030179] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fccbfd8f-e90f-44a3-8a3f-36acb905dfdb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.039638] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1142.039842] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1142.040670] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9bc737dc-3201-47d9-821a-0ed2052d8f36 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.046355] env[60024]: DEBUG oslo_vmware.api [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Waiting for the task: (returnval){ [ 1142.046355] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]525492f5-af6f-b212-b607-237d9c2c57f9" [ 1142.046355] env[60024]: _type = "Task" [ 1142.046355] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1142.060760] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1142.061027] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Creating directory with path [datastore2] vmware_temp/d4aeae84-ce1d-44ee-ac0f-a54a444510cb/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1142.061280] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-003a5210-e3fa-4f8f-afeb-f9da23077393 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.088076] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Created directory with path [datastore2] vmware_temp/d4aeae84-ce1d-44ee-ac0f-a54a444510cb/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1142.088340] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Fetch image to [datastore2] vmware_temp/d4aeae84-ce1d-44ee-ac0f-a54a444510cb/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1142.088570] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/d4aeae84-ce1d-44ee-ac0f-a54a444510cb/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1142.089456] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1c16bf0-6431-40fe-b5ab-f33334409eff {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.099831] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1476c255-77b1-4810-84d2-b6838b0fa67d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.104239] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1142.104589] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 
tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1142.104814] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Deleting the datastore file [datastore2] 08e2d758-9005-4822-b157-84710b9c5ed4 {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1142.105429] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0ad02dfa-734c-4df1-a6eb-33e1594523b3 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.112816] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-217bd2ac-b6da-41b8-bb5d-c6d8769a212e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.117675] env[60024]: DEBUG oslo_vmware.api [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Waiting for the task: (returnval){ [ 1142.117675] env[60024]: value = "task-4576327" [ 1142.117675] env[60024]: _type = "Task" [ 1142.117675] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1142.147047] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a825242-7ecf-4c05-a882-1b66fd4447b0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.152365] env[60024]: DEBUG oslo_vmware.api [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Task: {'id': task-4576327, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1142.156755] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c3a09cfe-7fc2-4bee-ae6e-c1bb9026ffc2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.183462] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1142.291263] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1142.292075] env[60024]: ERROR nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Traceback (most recent call last): [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] result = getattr(controller, method)(*args, **kwargs) [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self._get(image_id) [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] resp, body = self.http_client.get(url, headers=header) [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self.request(url, 'GET', **kwargs) [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self._handle_response(resp) [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] raise exc.from_response(resp, resp.content) [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] During handling of the above exception, another exception occurred: [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Traceback (most recent call last): [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] yield resources [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self.driver.spawn(context, instance, image_meta, [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1142.292075] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self._fetch_image_if_missing(context, vi) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] image_fetch(context, vi, tmp_image_ds_loc) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 
54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] images.fetch_image( [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] metadata = IMAGE_API.get(context, image_ref) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return session.show(context, image_id, [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] _reraise_translated_image_exception(image_id) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] raise new_exc.with_traceback(exc_trace) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] result = getattr(controller, method)(*args, **kwargs) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self._get(image_id) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] resp, body = self.http_client.get(url, headers=header) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self.request(url, 'GET', 
**kwargs) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self._handle_response(resp) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] raise exc.from_response(resp, resp.content) [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1142.293427] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1142.293427] env[60024]: INFO nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Terminating instance [ 1142.295112] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1142.295112] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1142.296166] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1142.296360] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1142.300027] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8b3ac7fa-1736-4372-8e3d-48afb87dc115 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.300027] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-226fb010-b5c5-4e09-809e-180d123ff37f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.304585] env[60024]: DEBUG nova.network.neutron [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1142.308868] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1142.309154] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0e348824-9417-4c57-be5f-a9ff6e1fcc4d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.312545] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1142.312545] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1142.313139] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8ef4dda5-4854-4b54-84d7-8c351db7ed6e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.319907] env[60024]: DEBUG oslo_vmware.api [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Waiting for the task: (returnval){ [ 1142.319907] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52599dac-3ef4-ed4e-a964-042e1a5db791" [ 1142.319907] env[60024]: _type = "Task" [ 1142.319907] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1142.325634] env[60024]: INFO nova.compute.manager [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Took 0.39 seconds to deallocate network for instance. [ 1142.336488] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1142.336488] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Creating directory with path [datastore2] vmware_temp/66819bd6-c311-448f-bd8a-d87303da7e14/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1142.337024] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-60625e55-8d24-439d-a11b-e9e14888fa36 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.353068] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Created directory with path [datastore2] vmware_temp/66819bd6-c311-448f-bd8a-d87303da7e14/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1142.353463] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Fetch image to [datastore2] vmware_temp/66819bd6-c311-448f-bd8a-d87303da7e14/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1142.353463] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/66819bd6-c311-448f-bd8a-d87303da7e14/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1142.354319] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-345c46c1-9aa4-499f-a040-eb4f5d4f5035 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.362797] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58ef4a73-25e3-4717-bf6f-1bf89f6958be {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.377909] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dcb5d17-6c0a-44ee-aee9-7f28a993fef7 {{(pid=60024) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.420098] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca3f8728-791f-437d-a6d4-78814c2cfa38 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.423087] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1142.423299] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1142.423474] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Deleting the datastore file [datastore2] 54919bf0-b9f3-4bfc-ba1a-c6a52013e351 {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1142.423951] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e2cc34d7-3dc5-4fd1-abc7-48b10bfc4b51 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.430163] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d6640633-90fd-4eb8-ac4c-0156f7ad568b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.433361] env[60024]: DEBUG oslo_vmware.api [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Waiting for the task: (returnval){ [ 1142.433361] env[60024]: value = "task-4576329" [ 1142.433361] env[60024]: _type = "Task" [ 1142.433361] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1142.443252] env[60024]: DEBUG oslo_vmware.api [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Task: {'id': task-4576329, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1142.443961] env[60024]: INFO nova.scheduler.client.report [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Deleted allocations for instance 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 [ 1142.454943] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1142.464438] env[60024]: DEBUG oslo_concurrency.lockutils [None req-68696909-aa75-4998-bb13-0c63cf192b9e tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Lock "5888cc9f-7341-4f9c-a93c-dd5ec95f7369" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 558.709s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1142.464721] env[60024]: DEBUG oslo_concurrency.lockutils [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Lock "5888cc9f-7341-4f9c-a93c-dd5ec95f7369" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 361.326s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1142.464943] env[60024]: DEBUG oslo_concurrency.lockutils [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Acquiring lock "5888cc9f-7341-4f9c-a93c-dd5ec95f7369-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1142.465173] env[60024]: DEBUG oslo_concurrency.lockutils [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Lock "5888cc9f-7341-4f9c-a93c-dd5ec95f7369-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1142.465343] env[60024]: DEBUG oslo_concurrency.lockutils [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Lock "5888cc9f-7341-4f9c-a93c-dd5ec95f7369-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1142.467411] env[60024]: INFO nova.compute.manager [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Terminating instance [ 1142.469179] env[60024]: DEBUG nova.compute.manager [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 
5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1142.469374] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1142.469813] env[60024]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-93fb1b81-fa7b-44d6-9e98-442ee224bab2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.480224] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83107505-0e56-49a0-9464-3e3f0d928238 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.514743] env[60024]: WARNING nova.virt.vmwareapi.vmops [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5888cc9f-7341-4f9c-a93c-dd5ec95f7369 could not be found. [ 1142.514743] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1142.515209] env[60024]: INFO nova.compute.manager [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1142.515498] env[60024]: DEBUG oslo.service.loopingcall [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=60024) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1142.515850] env[60024]: DEBUG nova.compute.manager [-] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1142.515991] env[60024]: DEBUG nova.network.neutron [-] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1142.573169] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1142.573797] env[60024]: ERROR nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Traceback (most recent call last): [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] result = getattr(controller, method)(*args, **kwargs) [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self._get(image_id) [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] resp, body = self.http_client.get(url, headers=header) [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self.request(url, 'GET', **kwargs) [ 1142.573797] env[60024]: ERROR nova.compute.manager 
[instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self._handle_response(resp) [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] raise exc.from_response(resp, resp.content) [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] During handling of the above exception, another exception occurred: [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Traceback (most recent call last): [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] yield resources [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self.driver.spawn(context, instance, image_meta, [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1142.573797] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self._fetch_image_if_missing(context, vi) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] image_fetch(context, vi, tmp_image_ds_loc) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] images.fetch_image( [ 1142.576188] env[60024]: ERROR 
nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] metadata = IMAGE_API.get(context, image_ref) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return session.show(context, image_id, [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] _reraise_translated_image_exception(image_id) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] raise new_exc.with_traceback(exc_trace) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] result = getattr(controller, method)(*args, **kwargs) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self._get(image_id) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] resp, body = self.http_client.get(url, headers=header) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self.request(url, 'GET', **kwargs) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: 
a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self._handle_response(resp) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] raise exc.from_response(resp, resp.content) [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1142.576188] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1142.576188] env[60024]: INFO nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Terminating instance [ 1142.577886] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1142.577886] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1142.577886] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1142.577886] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1142.577886] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dccaf8b1-18bf-43b7-ad56-5101de19e4bc {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.579306] env[60024]: DEBUG nova.network.neutron [-] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1142.580990] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-994dae11-8f88-43f9-8856-35284c0ea3bd {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.589318] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1142.589988] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e40cc012-247f-4678-a5ae-47aba6a444fa {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.591690] env[60024]: INFO nova.compute.manager [-] [instance: 5888cc9f-7341-4f9c-a93c-dd5ec95f7369] Took 0.08 seconds to deallocate network for instance. [ 1142.594208] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1142.594376] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1142.597778] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c1b9b285-d988-41e8-8803-df151c032c4f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.603831] env[60024]: DEBUG oslo_vmware.api [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Waiting for the task: (returnval){ [ 1142.603831] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]522a1431-b462-9987-b634-68fbd9a486ca" [ 1142.603831] env[60024]: _type = "Task" [ 1142.603831] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1142.612275] env[60024]: DEBUG oslo_vmware.api [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]522a1431-b462-9987-b634-68fbd9a486ca, 'name': SearchDatastore_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1142.627455] env[60024]: DEBUG oslo_vmware.api [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Task: {'id': task-4576327, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068176} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1142.628408] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1142.628600] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1142.628774] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1142.628946] env[60024]: INFO nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 1142.633160] env[60024]: DEBUG nova.compute.claims [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1142.633343] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1142.633555] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1142.659331] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1142.660128] env[60024]: DEBUG nova.compute.utils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Instance 08e2d758-9005-4822-b157-84710b9c5ed4 could not be found. {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1142.662667] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1142.662894] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1142.663096] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Deleting the datastore file [datastore2] a925d5fc-6437-40bb-adf1-ea10c32dde2a {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1142.663586] env[60024]: DEBUG nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Instance disappeared during build. 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1142.663749] env[60024]: DEBUG nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1142.663911] env[60024]: DEBUG nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1142.664088] env[60024]: DEBUG nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1142.664250] env[60024]: DEBUG nova.network.neutron [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1142.665997] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-41f14d58-b821-46e5-9869-52d4ba7a1061 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1142.673879] env[60024]: DEBUG oslo_vmware.api [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Waiting for the task: (returnval){ [ 1142.673879] env[60024]: value = "task-4576331" [ 1142.673879] env[60024]: _type = "Task" [ 1142.673879] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1142.684240] env[60024]: DEBUG oslo_vmware.api [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Task: {'id': task-4576331, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1142.688039] env[60024]: DEBUG oslo_concurrency.lockutils [None req-4d962258-7b79-4028-86dc-e7e389009bc0 tempest-TenantUsagesTestJSON-787131523 tempest-TenantUsagesTestJSON-787131523-project-member] Lock "5888cc9f-7341-4f9c-a93c-dd5ec95f7369" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.223s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1142.767731] env[60024]: DEBUG neutronclient.v2_0.client [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60024) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1142.771733] env[60024]: ERROR nova.compute.manager [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Traceback (most recent call last): [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] result = getattr(controller, method)(*args, **kwargs) [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self._get(image_id) [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] resp, body = self.http_client.get(url, headers=header) [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self.request(url, 'GET', **kwargs) [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 
08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self._handle_response(resp) [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] raise exc.from_response(resp, resp.content) [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] During handling of the above exception, another exception occurred: [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Traceback (most recent call last): [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self.driver.spawn(context, instance, image_meta, [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self._fetch_image_if_missing(context, vi) [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1142.771733] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] image_fetch(context, vi, tmp_image_ds_loc) [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] images.fetch_image( [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] metadata = IMAGE_API.get(context, image_ref) [ 1142.772687] env[60024]: 
ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return session.show(context, image_id, [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] _reraise_translated_image_exception(image_id) [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] raise new_exc.with_traceback(exc_trace) [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] result = getattr(controller, method)(*args, **kwargs) [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self._get(image_id) [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] resp, body = self.http_client.get(url, headers=header) [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self.request(url, 'GET', **kwargs) [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self._handle_response(resp) [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1142.772687] env[60024]: ERROR 
nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] raise exc.from_response(resp, resp.content) [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] During handling of the above exception, another exception occurred: [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Traceback (most recent call last): [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self._build_and_run_instance(context, instance, image, [ 1142.772687] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] with excutils.save_and_reraise_exception(): [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self.force_reraise() [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] raise self.value [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] with self.rt.instance_claim(context, instance, node, allocs, [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self.abort() [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return f(*args, **kwargs) [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 
08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self._unset_instance_host_and_node(instance) [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] instance.save() [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] updates, result = self.indirection_api.object_action( [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return cctxt.call(context, 'object_action', objinst=objinst, [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] result = self.transport._send( [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self._driver.send(target, ctxt, message, [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] raise result [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] nova.exception_Remote.InstanceNotFound_Remote: Instance 08e2d758-9005-4822-b157-84710b9c5ed4 could not be found. 
[ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Traceback (most recent call last): [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1142.773683] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return getattr(target, method)(*args, **kwargs) [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return fn(self, *args, **kwargs) [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] old_ref, inst_ref = db.instance_update_and_get_original( [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return f(*args, **kwargs) [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] with excutils.save_and_reraise_exception() as ectxt: [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self.force_reraise() [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] raise self.value [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return f(*args, 
**kwargs) [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return f(context, *args, **kwargs) [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] raise exception.InstanceNotFound(instance_id=uuid) [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] nova.exception.InstanceNotFound: Instance 08e2d758-9005-4822-b157-84710b9c5ed4 could not be found. [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] During handling of the above exception, another exception occurred: [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Traceback (most recent call last): [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] ret = obj(*args, **kwargs) [ 1142.774736] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] exception_handler_v20(status_code, error_body) [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] raise client_exc(message=error_message, [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1142.775930] 
env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Neutron server returns request_ids: ['req-abbb08ca-5a2f-4842-917d-cd3b01d29d21'] [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] During handling of the above exception, another exception occurred: [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] Traceback (most recent call last): [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self._deallocate_network(context, instance, requested_networks) [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self.network_api.deallocate_for_instance( [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] data = neutron.list_ports(**search_opts) [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] ret = obj(*args, **kwargs) [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self.list('ports', self.ports_path, retrieve_all, [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] ret = obj(*args, **kwargs) [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] for r in self._pagination(collection, path, **params): [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] res = self.get(path, params=params) [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] ret = obj(*args, **kwargs) [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self.retry_request("GET", action, body=body, [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] ret = obj(*args, **kwargs) [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] return self.do_request(method, action, body=body, [ 1142.775930] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1142.776981] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] ret = obj(*args, **kwargs) [ 1142.776981] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1142.776981] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] self._handle_fault_response(status_code, replybody, resp) [ 1142.776981] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1142.776981] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] raise exception.Unauthorized() [ 1142.776981] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] nova.exception.Unauthorized: Not authorized. [ 1142.776981] env[60024]: ERROR nova.compute.manager [instance: 08e2d758-9005-4822-b157-84710b9c5ed4] [ 1142.794702] env[60024]: DEBUG oslo_concurrency.lockutils [None req-30b24ba5-531e-443c-a69b-a353e14a01d7 tempest-ServerAddressesTestJSON-1201308295 tempest-ServerAddressesTestJSON-1201308295-project-member] Lock "08e2d758-9005-4822-b157-84710b9c5ed4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 491.302s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1142.944977] env[60024]: DEBUG oslo_vmware.api [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Task: {'id': task-4576329, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070321} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1142.945328] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1142.945519] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1142.945695] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1142.945869] env[60024]: INFO nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Took 0.65 seconds to destroy the instance on the hypervisor. [ 1142.947989] env[60024]: DEBUG nova.compute.claims [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1142.948175] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1142.948389] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1142.973617] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1142.974356] env[60024]: DEBUG nova.compute.utils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Instance 54919bf0-b9f3-4bfc-ba1a-c6a52013e351 could not be found. 
{{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1142.976042] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Instance disappeared during build. {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1142.976303] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1142.976513] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1142.976692] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1142.976854] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1143.080925] env[60024]: DEBUG neutronclient.v2_0.client [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60024) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1143.082514] env[60024]: ERROR nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Traceback (most recent call last): [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] result = getattr(controller, method)(*args, **kwargs) [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self._get(image_id) [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] resp, body = self.http_client.get(url, headers=header) [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self.request(url, 'GET', **kwargs) [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self._handle_response(resp) [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] raise exc.from_response(resp, resp.content) [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] During handling of the above exception, another exception occurred: [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Traceback (most recent call last): [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self.driver.spawn(context, instance, image_meta, [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self._fetch_image_if_missing(context, vi) [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1143.082514] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] image_fetch(context, vi, tmp_image_ds_loc) [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] images.fetch_image( [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] metadata = IMAGE_API.get(context, image_ref) [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return session.show(context, image_id, [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] _reraise_translated_image_exception(image_id) [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] raise new_exc.with_traceback(exc_trace) [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 
54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] result = getattr(controller, method)(*args, **kwargs) [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self._get(image_id) [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] resp, body = self.http_client.get(url, headers=header) [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self.request(url, 'GET', **kwargs) [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self._handle_response(resp) [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] raise exc.from_response(resp, resp.content) [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. 
[ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] During handling of the above exception, another exception occurred: [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Traceback (most recent call last): [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self._build_and_run_instance(context, instance, image, [ 1143.084037] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] with excutils.save_and_reraise_exception(): [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self.force_reraise() [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] raise self.value [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] with self.rt.instance_claim(context, instance, node, allocs, [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self.abort() [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return f(*args, **kwargs) [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self._unset_instance_host_and_node(instance) [ 1143.085604] env[60024]: ERROR nova.compute.manager 
[instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] instance.save() [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] updates, result = self.indirection_api.object_action( [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return cctxt.call(context, 'object_action', objinst=objinst, [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] result = self.transport._send( [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self._driver.send(target, ctxt, message, [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] raise result [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] nova.exception_Remote.InstanceNotFound_Remote: Instance 54919bf0-b9f3-4bfc-ba1a-c6a52013e351 could not be found. 
[ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Traceback (most recent call last): [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1143.085604] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return getattr(target, method)(*args, **kwargs) [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return fn(self, *args, **kwargs) [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] old_ref, inst_ref = db.instance_update_and_get_original( [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return f(*args, **kwargs) [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] with excutils.save_and_reraise_exception() as ectxt: [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self.force_reraise() [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] raise self.value [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return f(*args, 
**kwargs) [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return f(context, *args, **kwargs) [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] raise exception.InstanceNotFound(instance_id=uuid) [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] nova.exception.InstanceNotFound: Instance 54919bf0-b9f3-4bfc-ba1a-c6a52013e351 could not be found. [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] During handling of the above exception, another exception occurred: [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Traceback (most recent call last): [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] ret = obj(*args, **kwargs) [ 1143.087229] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] exception_handler_v20(status_code, error_body) [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] raise client_exc(message=error_message, [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1143.089353] 
env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Neutron server returns request_ids: ['req-456b6ff0-1870-4e92-ab14-bbd87f67dbff'] [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] During handling of the above exception, another exception occurred: [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] Traceback (most recent call last): [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self._deallocate_network(context, instance, requested_networks) [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self.network_api.deallocate_for_instance( [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] data = neutron.list_ports(**search_opts) [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] ret = obj(*args, **kwargs) [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self.list('ports', self.ports_path, retrieve_all, [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] ret = obj(*args, **kwargs) [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] for r in self._pagination(collection, path, **params): [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] res = self.get(path, params=params) [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] ret = obj(*args, **kwargs) [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self.retry_request("GET", action, body=body, [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] ret = obj(*args, **kwargs) [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] return self.do_request(method, action, body=body, [ 1143.089353] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1143.090937] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] ret = obj(*args, **kwargs) [ 1143.090937] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1143.090937] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] self._handle_fault_response(status_code, replybody, resp) [ 1143.090937] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1143.090937] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] raise exception.Unauthorized() [ 1143.090937] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] nova.exception.Unauthorized: Not authorized. 
[ 1143.090937] env[60024]: ERROR nova.compute.manager [instance: 54919bf0-b9f3-4bfc-ba1a-c6a52013e351] [ 1143.104645] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Lock "54919bf0-b9f3-4bfc-ba1a-c6a52013e351" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 490.092s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1143.114489] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1143.114784] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Creating directory with path [datastore2] vmware_temp/ecd8e4ea-d0cd-4739-a518-78e85ec3f712/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1143.115028] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-48b542b9-8318-4fd8-9dc4-9a687b72a21b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.126848] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Created directory with path [datastore2] vmware_temp/ecd8e4ea-d0cd-4739-a518-78e85ec3f712/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1143.127060] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Fetch image to [datastore2] vmware_temp/ecd8e4ea-d0cd-4739-a518-78e85ec3f712/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1143.127235] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/ecd8e4ea-d0cd-4739-a518-78e85ec3f712/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1143.127963] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-373ba938-5fdd-45ef-b78f-c2fef18213ac {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.135279] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-a824adfe-1513-433e-b57d-6d8e4449e0e7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.144502] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49cacdd0-7c0d-4d2d-a951-450e41e7ea11 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.178132] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b053befa-9d24-4efc-a965-a83712bc720a {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.187924] env[60024]: DEBUG oslo_vmware.api [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Task: {'id': task-4576331, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07587} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1143.189453] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1143.189647] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1143.189817] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1143.189993] env[60024]: INFO nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 1143.191897] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-34f2ac1d-12ea-4f1d-b572-79e0f94aa392 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.194045] env[60024]: DEBUG nova.compute.claims [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1143.194234] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1143.194451] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1143.217783] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1143.220723] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1143.221453] env[60024]: DEBUG nova.compute.utils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Instance a925d5fc-6437-40bb-adf1-ea10c32dde2a could not be found. {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1143.222831] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Instance disappeared during build. 
{{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1143.223017] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1143.223180] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1143.223344] env[60024]: DEBUG nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1143.223499] env[60024]: DEBUG nova.network.neutron [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1143.315976] env[60024]: DEBUG neutronclient.v2_0.client [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60024) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1143.317514] env[60024]: ERROR nova.compute.manager [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Traceback (most recent call last): [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] result = getattr(controller, method)(*args, **kwargs) [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self._get(image_id) [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] resp, body = self.http_client.get(url, headers=header) [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self.request(url, 'GET', **kwargs) [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self._handle_response(resp) [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] raise exc.from_response(resp, resp.content) [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] During handling of the above exception, another exception occurred: [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Traceback (most recent call last): [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self.driver.spawn(context, instance, image_meta, [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self._fetch_image_if_missing(context, vi) [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1143.317514] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] image_fetch(context, vi, tmp_image_ds_loc) [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] images.fetch_image( [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] metadata = IMAGE_API.get(context, image_ref) [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return session.show(context, image_id, [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] _reraise_translated_image_exception(image_id) [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] raise new_exc.with_traceback(exc_trace) [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: 
a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] result = getattr(controller, method)(*args, **kwargs) [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self._get(image_id) [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] resp, body = self.http_client.get(url, headers=header) [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self.request(url, 'GET', **kwargs) [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self._handle_response(resp) [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] raise exc.from_response(resp, resp.content) [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. 
[ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] During handling of the above exception, another exception occurred: [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Traceback (most recent call last): [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self._build_and_run_instance(context, instance, image, [ 1143.318442] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] with excutils.save_and_reraise_exception(): [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self.force_reraise() [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] raise self.value [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] with self.rt.instance_claim(context, instance, node, allocs, [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self.abort() [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return f(*args, **kwargs) [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self._unset_instance_host_and_node(instance) [ 1143.319345] env[60024]: ERROR nova.compute.manager 
[instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] instance.save() [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] updates, result = self.indirection_api.object_action( [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return cctxt.call(context, 'object_action', objinst=objinst, [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] result = self.transport._send( [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self._driver.send(target, ctxt, message, [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] raise result [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] nova.exception_Remote.InstanceNotFound_Remote: Instance a925d5fc-6437-40bb-adf1-ea10c32dde2a could not be found. 
[ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Traceback (most recent call last): [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1143.319345] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return getattr(target, method)(*args, **kwargs) [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return fn(self, *args, **kwargs) [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] old_ref, inst_ref = db.instance_update_and_get_original( [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return f(*args, **kwargs) [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] with excutils.save_and_reraise_exception() as ectxt: [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self.force_reraise() [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] raise self.value [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return f(*args, 
**kwargs) [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return f(context, *args, **kwargs) [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] raise exception.InstanceNotFound(instance_id=uuid) [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] nova.exception.InstanceNotFound: Instance a925d5fc-6437-40bb-adf1-ea10c32dde2a could not be found. [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] During handling of the above exception, another exception occurred: [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Traceback (most recent call last): [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] ret = obj(*args, **kwargs) [ 1143.320374] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] exception_handler_v20(status_code, error_body) [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] raise client_exc(message=error_message, [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1143.321691] 
env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Neutron server returns request_ids: ['req-758e9025-586f-495b-9e3c-fd7bd29a02d0'] [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] During handling of the above exception, another exception occurred: [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] Traceback (most recent call last): [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self._deallocate_network(context, instance, requested_networks) [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self.network_api.deallocate_for_instance( [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] data = neutron.list_ports(**search_opts) [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] ret = obj(*args, **kwargs) [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self.list('ports', self.ports_path, retrieve_all, [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] ret = obj(*args, **kwargs) [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] for r in self._pagination(collection, path, **params): [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] res = self.get(path, params=params) [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] ret = obj(*args, **kwargs) [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self.retry_request("GET", action, body=body, [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] ret = obj(*args, **kwargs) [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] return self.do_request(method, action, body=body, [ 1143.321691] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1143.322751] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] ret = obj(*args, **kwargs) [ 1143.322751] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1143.322751] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] self._handle_fault_response(status_code, replybody, resp) [ 1143.322751] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1143.322751] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] raise exception.Unauthorized() [ 1143.322751] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] nova.exception.Unauthorized: Not authorized. [ 1143.322751] env[60024]: ERROR nova.compute.manager [instance: a925d5fc-6437-40bb-adf1-ea10c32dde2a] [ 1143.343416] env[60024]: DEBUG oslo_concurrency.lockutils [None req-486ce0a8-156c-48a6-abf8-e5e7d3003872 tempest-MultipleCreateTestJSON-328947127 tempest-MultipleCreateTestJSON-328947127-project-member] Lock "a925d5fc-6437-40bb-adf1-ea10c32dde2a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 490.300s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1143.346966] env[60024]: DEBUG oslo_vmware.rw_handles [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ecd8e4ea-d0cd-4739-a518-78e85ec3f712/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2.
{{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1143.405651] env[60024]: DEBUG oslo_vmware.rw_handles [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Completed reading data from the image iterator. {{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1143.405862] env[60024]: DEBUG oslo_vmware.rw_handles [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ecd8e4ea-d0cd-4739-a518-78e85ec3f712/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1143.775756] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._sync_power_states {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1143.788329] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Getting list of instances from cluster (obj){ [ 1143.788329] env[60024]: value = "domain-c8" [ 1143.788329] env[60024]: _type = "ClusterComputeResource" [ 1143.788329] env[60024]: } {{(pid=60024) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1143.789632] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b7fd3dd-b99f-4eee-810a-c32bfb9a5cc1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1143.804753] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Got total of 6 instances {{(pid=60024) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1143.804960] env[60024]: WARNING nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] While synchronizing instance power states, found 1 instances in the database and 6 instances on the hypervisor. 
[ 1143.805123] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Triggering sync for uuid 8b64034a-4d67-4605-adb8-a007dd735230 {{(pid=60024) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 1143.805495] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "8b64034a-4d67-4605-adb8-a007dd735230" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1183.372058] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1184.341875] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1184.342119] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1184.342289] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1184.342439] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=60024) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1184.342588] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1184.353434] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1184.353660] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1184.353829] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1184.353990] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=60024) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1184.355158] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73cf1453-ae49-4968-91d3-592d3988ff0d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.364909] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de27c2b1-7648-4033-9d5a-8b89e52e6911 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.380018] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c376d66c-9ac0-46fe-a579-8b516a45d886 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.387888] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44b5a358-ceff-4a4d-8e84-8e9379e03dee {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.420013] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180689MB free_disk=97GB free_vcpus=48 pci_devices=None {{(pid=60024) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1184.420211] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1184.420423] env[60024]: DEBUG oslo_concurrency.lockutils [None 
req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1184.461597] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Instance 8b64034a-4d67-4605-adb8-a007dd735230 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=60024) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1184.461892] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1184.462096] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=100GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=60024) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1184.478849] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Refreshing inventories for resource provider 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1184.492756] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Updating ProviderTree inventory for provider 5b70561f-4086-4d22-a0b6-aa1035435329 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1184.494280] env[60024]: DEBUG nova.compute.provider_tree [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Updating inventory in ProviderTree for provider 5b70561f-4086-4d22-a0b6-aa1035435329 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1184.511190] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Refreshing aggregate associations for resource provider 5b70561f-4086-4d22-a0b6-aa1035435329, aggregates: None {{(pid=60024) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1184.528270] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Refreshing trait associations for resource provider 5b70561f-4086-4d22-a0b6-aa1035435329, traits: 
COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NODE {{(pid=60024) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1184.556537] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06709ab3-e014-43f3-864f-bfa19e25fbd8 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.564755] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e216cfb4-8024-4d12-a56b-5c8bf90bdca9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.595812] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a0fac47-da81-4262-97e1-a13604a01fd8 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.604253] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3e3f755-a700-4205-928f-4a6a814e1e40 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1184.618584] env[60024]: DEBUG nova.compute.provider_tree [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed in ProviderTree for provider: 5b70561f-4086-4d22-a0b6-aa1035435329 {{(pid=60024) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1184.627328] env[60024]: DEBUG nova.scheduler.client.report [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Inventory has not changed for provider 5b70561f-4086-4d22-a0b6-aa1035435329 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 200, 'reserved': 0, 'min_unit': 1, 'max_unit': 97, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=60024) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1184.640849] env[60024]: DEBUG nova.compute.resource_tracker [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=60024) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1184.641061] env[60024]: DEBUG oslo_concurrency.lockutils [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1185.635978] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1187.341498] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1187.341973] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 
None None] Starting heal instance info cache {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1187.341973] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Rebuilding the list of instances to heal {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1187.352071] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Skipping network cache update for instance because it is Building. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1187.352249] env[60024]: DEBUG nova.compute.manager [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Didn't find any instances for network info cache update. {{(pid=60024) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1187.352455] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1187.352848] env[60024]: DEBUG oslo_service.periodic_task [None req-6e946a11-c4c7-4923-a0f4-90836c546fd3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=60024) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1191.572207] env[60024]: WARNING oslo_vmware.rw_handles [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1191.572207] env[60024]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1191.572207] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1191.572207] env[60024]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1191.572207] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1191.572207] env[60024]: ERROR oslo_vmware.rw_handles response.begin() [ 1191.572207] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1191.572207] env[60024]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1191.572207] env[60024]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1191.572207] env[60024]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1191.572207] env[60024]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1191.572207] env[60024]: ERROR oslo_vmware.rw_handles [ 1191.573031] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Downloaded image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to vmware_temp/ecd8e4ea-d0cd-4739-a518-78e85ec3f712/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1191.575187] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Caching image {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1191.575369] env[60024]: DEBUG nova.virt.vmwareapi.vm_util [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Copying Virtual Disk [datastore2] vmware_temp/ecd8e4ea-d0cd-4739-a518-78e85ec3f712/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk to [datastore2] vmware_temp/ecd8e4ea-d0cd-4739-a518-78e85ec3f712/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk {{(pid=60024) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1191.575584] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ba249bf1-13df-4cdc-ae88-754a54cfdea2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1191.586175] env[60024]: DEBUG oslo_vmware.api [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Waiting for the task: (returnval){ [ 1191.586175] env[60024]: value = "task-4576332" [ 1191.586175] env[60024]: _type = "Task" [ 1191.586175] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1191.594828] env[60024]: DEBUG oslo_vmware.api [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Task: {'id': task-4576332, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1192.097563] env[60024]: DEBUG oslo_vmware.exceptions [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Fault InvalidArgument not matched. 
{{(pid=60024) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1192.097836] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1192.098398] env[60024]: ERROR nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1192.098398] env[60024]: Faults: ['InvalidArgument'] [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] Traceback (most recent call last): [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] yield resources [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] self.driver.spawn(context, instance, image_meta, [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] self._fetch_image_if_missing(context, vi) [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] image_cache(vi, tmp_image_ds_loc) [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] vm_util.copy_virtual_disk( [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] session._wait_for_task(vmdk_copy_task) [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] return self.wait_for_task(task_ref) [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] return evt.wait() [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] result = hub.switch() [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] return self.greenlet.switch() [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] self.f(*self.args, **self.kw) [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] raise exceptions.translate_fault(task_info.error) [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] Faults: ['InvalidArgument'] [ 1192.098398] env[60024]: ERROR nova.compute.manager [instance: 076c3dd5-9043-456d-af24-0d2273321085] [ 1192.099759] env[60024]: INFO nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Terminating instance [ 1192.100274] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1192.100482] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1192.100711] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e3247c62-4310-4947-ba95-2a9231dde053 {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.103225] env[60024]: DEBUG nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Start destroying the instance on the hypervisor. {{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1192.103411] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1192.104146] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b93c5dea-de92-47a2-9fcd-f486b90185cf {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.111998] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1192.112254] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a3be96a2-dcb7-4a67-b35d-295e1868e641 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.114647] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1192.114868] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1192.115870] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-deb68911-417e-43c5-a2c4-1db4d64b50ba {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.121315] env[60024]: DEBUG oslo_vmware.api [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Waiting for the task: (returnval){ [ 1192.121315] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52b9f436-1360-f55d-dcca-9ea5847cd589" [ 1192.121315] env[60024]: _type = "Task" [ 1192.121315] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1192.136939] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1192.137212] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Creating directory with path [datastore2] vmware_temp/171715b0-b80f-44af-9d3a-af44064789cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1192.137518] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-46c000b0-7696-47c1-be06-04f32b4f5c03 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.160077] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Created directory with path [datastore2] vmware_temp/171715b0-b80f-44af-9d3a-af44064789cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1192.160077] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Fetch image to [datastore2] vmware_temp/171715b0-b80f-44af-9d3a-af44064789cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1192.160343] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/171715b0-b80f-44af-9d3a-af44064789cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1192.161013] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18d91dd8-15d5-4d36-b80c-9247fe5be8bd {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.169406] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe85d04b-0a1a-4257-8ad7-ca2bd3e9fadc {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.179897] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-878a7240-d242-4ce4-a2be-3241bedf2476 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.215811] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-752da4e8-c928-4d0a-94de-3f882f095e6b {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.218477] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1192.218669] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1192.218840] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Deleting the datastore file [datastore2] 076c3dd5-9043-456d-af24-0d2273321085 {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1192.219090] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0c3b9f63-2a65-439c-bf5b-5c9197d12333 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.224854] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d14db8a0-f635-46b9-baff-6d89e4f798c1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.227998] env[60024]: DEBUG oslo_vmware.api [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Waiting for the task: (returnval){ [ 1192.227998] env[60024]: value = "task-4576334" [ 1192.227998] env[60024]: _type = "Task" [ 1192.227998] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1192.236631] env[60024]: DEBUG oslo_vmware.api [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Task: {'id': task-4576334, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1192.250213] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1192.352632] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1192.353474] env[60024]: ERROR nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Traceback (most recent call last): [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] result = getattr(controller, method)(*args, **kwargs) [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self._get(image_id) [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] resp, body = self.http_client.get(url, headers=header) [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self.request(url, 'GET', **kwargs) [ 1192.353474] env[60024]: ERROR nova.compute.manager 
[instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self._handle_response(resp) [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] raise exc.from_response(resp, resp.content) [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] During handling of the above exception, another exception occurred: [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Traceback (most recent call last): [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] yield resources [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self.driver.spawn(context, instance, image_meta, [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1192.353474] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self._fetch_image_if_missing(context, vi) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] image_fetch(context, vi, tmp_image_ds_loc) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] images.fetch_image( [ 1192.354419] env[60024]: ERROR 
nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] metadata = IMAGE_API.get(context, image_ref) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return session.show(context, image_id, [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] _reraise_translated_image_exception(image_id) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] raise new_exc.with_traceback(exc_trace) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] result = getattr(controller, method)(*args, **kwargs) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self._get(image_id) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] resp, body = self.http_client.get(url, headers=header) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self.request(url, 'GET', **kwargs) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: 
fcf47169-eb7a-4644-bf3f-7150c44c247f] return self._handle_response(resp) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] raise exc.from_response(resp, resp.content) [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1192.354419] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1192.354419] env[60024]: INFO nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Terminating instance [ 1192.355449] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1192.355656] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1192.356345] env[60024]: DEBUG nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1192.356534] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1192.356768] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-425f5733-b91f-4dc2-98fe-b23bbd5106ee {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.359637] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9bb0322-c9ef-4dda-88f7-b1244e07c1d3 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.367227] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1192.367487] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-223f1c18-be73-4c2f-a714-740be26244e6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.369885] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1192.370073] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1192.371072] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d586d398-14eb-431b-8089-bab1dbba6034 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.376731] env[60024]: DEBUG oslo_vmware.api [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Waiting for the task: (returnval){ [ 1192.376731] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]5277540b-5b31-1dd4-99dc-b9f29fe50eb3" [ 1192.376731] env[60024]: _type = "Task" [ 1192.376731] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1192.385599] env[60024]: DEBUG oslo_vmware.api [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]5277540b-5b31-1dd4-99dc-b9f29fe50eb3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1192.450717] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1192.450974] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1192.451104] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Deleting the datastore file [datastore2] fcf47169-eb7a-4644-bf3f-7150c44c247f {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1192.451379] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-74b6a6c7-b1db-4d4b-8f45-60d1587573a1 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.459311] env[60024]: DEBUG oslo_vmware.api [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Waiting for the task: (returnval){ [ 1192.459311] env[60024]: value = "task-4576336" [ 1192.459311] env[60024]: _type = "Task" [ 1192.459311] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1192.468009] env[60024]: DEBUG oslo_vmware.api [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Task: {'id': task-4576336, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1192.739445] env[60024]: DEBUG oslo_vmware.api [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Task: {'id': task-4576334, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081234} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1192.739923] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1192.739923] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1192.740090] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1192.740150] env[60024]: INFO nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Took 0.64 seconds to destroy the instance on the hypervisor. [ 1192.742320] env[60024]: DEBUG nova.compute.claims [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1192.742494] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.742703] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.768503] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1192.769246] env[60024]: DEBUG nova.compute.utils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Instance 076c3dd5-9043-456d-af24-0d2273321085 
could not be found. {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1192.770682] env[60024]: DEBUG nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Instance disappeared during build. {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1192.770851] env[60024]: DEBUG nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1192.771019] env[60024]: DEBUG nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1192.771196] env[60024]: DEBUG nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1192.771382] env[60024]: DEBUG nova.network.neutron [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1192.888086] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1192.888355] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Creating directory with path [datastore2] vmware_temp/c9c6de7e-8cc6-452f-b0cf-5e39c9746444/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1192.888591] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e03bcf9f-a933-47a5-b65f-83b181630074 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.901578] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Created directory with path [datastore2] vmware_temp/c9c6de7e-8cc6-452f-b0cf-5e39c9746444/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1192.901794] 
env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Fetch image to [datastore2] vmware_temp/c9c6de7e-8cc6-452f-b0cf-5e39c9746444/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1192.901965] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/c9c6de7e-8cc6-452f-b0cf-5e39c9746444/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1192.902788] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c5dbbbe-8208-4e51-93dd-4373ba6047c2 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.910746] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d40b673a-542a-4cbc-b3f7-4a8cfe0ead2f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.920754] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90a9d85a-6bd4-43ef-aa07-389ec098a5f6 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.954113] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2b79144-a72a-492a-b11e-37012c46c5ae {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.964354] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0c848410-23d7-44d8-b3f2-701e773e82d9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.972595] env[60024]: DEBUG oslo_vmware.api [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Task: {'id': task-4576336, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073653} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1192.972869] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1192.973056] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1192.973224] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1192.973446] env[60024]: INFO nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1192.975695] env[60024]: DEBUG nova.compute.claims [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1192.975863] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.976120] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.987927] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1193.002976] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1193.003744] env[60024]: DEBUG nova.compute.utils [None 
req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Instance fcf47169-eb7a-4644-bf3f-7150c44c247f could not be found. {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1193.005327] env[60024]: DEBUG nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Instance disappeared during build. {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1193.005577] env[60024]: DEBUG nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1193.005659] env[60024]: DEBUG nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1193.005852] env[60024]: DEBUG nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1193.005972] env[60024]: DEBUG nova.network.neutron [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1193.083724] env[60024]: DEBUG nova.network.neutron [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Updating instance_info_cache with network_info: [] {{(pid=60024) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1193.093741] env[60024]: INFO nova.compute.manager [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Took 0.32 seconds to deallocate network for instance. 
[ 1193.100827] env[60024]: DEBUG nova.compute.manager [req-bb2dcf70-6a75-4737-9035-456126f26001 req-e9ae329a-fbdc-47b5-830e-19a5131fd1c4 service nova] [instance: 076c3dd5-9043-456d-af24-0d2273321085] Received event network-vif-deleted-4f1d6c09-ea11-4153-abe8-bb758f2f52cc {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1193.112269] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1193.113154] env[60024]: ERROR nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Traceback (most recent call last): [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] result = getattr(controller, method)(*args, **kwargs) [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self._get(image_id) [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] resp, body = self.http_client.get(url, headers=header) [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self.request(url, 'GET', **kwargs) [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1193.113154] 
env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self._handle_response(resp) [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] raise exc.from_response(resp, resp.content) [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] During handling of the above exception, another exception occurred: [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Traceback (most recent call last): [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] yield resources [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self.driver.spawn(context, instance, image_meta, [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1193.113154] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self._fetch_image_if_missing(context, vi) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] image_fetch(context, vi, tmp_image_ds_loc) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] images.fetch_image( [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1193.114634] env[60024]: 
ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] metadata = IMAGE_API.get(context, image_ref) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return session.show(context, image_id, [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] _reraise_translated_image_exception(image_id) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] raise new_exc.with_traceback(exc_trace) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] result = getattr(controller, method)(*args, **kwargs) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self._get(image_id) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] resp, body = self.http_client.get(url, headers=header) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self.request(url, 'GET', **kwargs) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self._handle_response(resp) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] raise exc.from_response(resp, resp.content) [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1193.114634] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.114634] env[60024]: INFO nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Terminating instance [ 1193.116046] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1193.116119] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1193.116766] env[60024]: DEBUG nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1193.116951] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1193.117202] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6db6bcae-55c0-4aa9-a5ef-265b310fa18b {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.120085] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dba66b36-3ec9-4b96-a9ef-d98738d2e354 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.130186] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1193.130616] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ed99dea8-036c-42a3-900a-62fa8bcde9d4 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.134044] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1193.134044] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1193.136091] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-16e8a9ec-a0f2-4521-af12-6f3387b1bac9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.145158] env[60024]: DEBUG oslo_vmware.api [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Waiting for the task: (returnval){ [ 1193.145158] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52ce47fb-df67-d27a-757b-0d3a1b1b5e09" [ 1193.145158] env[60024]: _type = "Task" [ 1193.145158] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1193.149824] env[60024]: DEBUG oslo_concurrency.lockutils [None req-2fffb6e4-07b3-4aeb-a603-d93f7172df13 tempest-ServersNegativeTestMultiTenantJSON-1610443141 tempest-ServersNegativeTestMultiTenantJSON-1610443141-project-member] Lock "076c3dd5-9043-456d-af24-0d2273321085" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 539.150s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1193.156553] env[60024]: DEBUG neutronclient.v2_0.client [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60024) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1193.157936] env[60024]: ERROR nova.compute.manager [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Traceback (most recent call last): [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] result = getattr(controller, method)(*args, **kwargs) [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self._get(image_id) [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] resp, body = self.http_client.get(url, headers=header) [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self.request(url, 'GET', **kwargs) [ 1193.157936] env[60024]: 
ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self._handle_response(resp) [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] raise exc.from_response(resp, resp.content) [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] During handling of the above exception, another exception occurred: [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Traceback (most recent call last): [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self.driver.spawn(context, instance, image_meta, [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self._fetch_image_if_missing(context, vi) [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1193.157936] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] image_fetch(context, vi, tmp_image_ds_loc) [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] images.fetch_image( [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] metadata = IMAGE_API.get(context, 
image_ref) [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return session.show(context, image_id, [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] _reraise_translated_image_exception(image_id) [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] raise new_exc.with_traceback(exc_trace) [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] result = getattr(controller, method)(*args, **kwargs) [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self._get(image_id) [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] resp, body = self.http_client.get(url, headers=header) [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self.request(url, 'GET', **kwargs) [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self._handle_response(resp) [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 
1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] raise exc.from_response(resp, resp.content) [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] During handling of the above exception, another exception occurred: [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Traceback (most recent call last): [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self._build_and_run_instance(context, instance, image, [ 1193.158794] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] with excutils.save_and_reraise_exception(): [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self.force_reraise() [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] raise self.value [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] with self.rt.instance_claim(context, instance, node, allocs, [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self.abort() [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return f(*args, **kwargs) [ 1193.159631] env[60024]: ERROR nova.compute.manager 
[instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self._unset_instance_host_and_node(instance) [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] instance.save() [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] updates, result = self.indirection_api.object_action( [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return cctxt.call(context, 'object_action', objinst=objinst, [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] result = self.transport._send( [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self._driver.send(target, ctxt, message, [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] raise result [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] nova.exception_Remote.InstanceNotFound_Remote: Instance fcf47169-eb7a-4644-bf3f-7150c44c247f could not be found. 
[ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Traceback (most recent call last): [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1193.159631] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return getattr(target, method)(*args, **kwargs) [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return fn(self, *args, **kwargs) [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] old_ref, inst_ref = db.instance_update_and_get_original( [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return f(*args, **kwargs) [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] with excutils.save_and_reraise_exception() as ectxt: [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self.force_reraise() [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] raise self.value [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return f(*args, 
**kwargs) [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return f(context, *args, **kwargs) [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] raise exception.InstanceNotFound(instance_id=uuid) [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] nova.exception.InstanceNotFound: Instance fcf47169-eb7a-4644-bf3f-7150c44c247f could not be found. [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] During handling of the above exception, another exception occurred: [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Traceback (most recent call last): [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] ret = obj(*args, **kwargs) [ 1193.160564] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] exception_handler_v20(status_code, error_body) [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] raise client_exc(message=error_message, [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1193.161655] 
env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Neutron server returns request_ids: ['req-887390b8-b710-4e87-86e2-38889604a0b3'] [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] During handling of the above exception, another exception occurred: [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] Traceback (most recent call last): [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self._deallocate_network(context, instance, requested_networks) [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self.network_api.deallocate_for_instance( [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] data = neutron.list_ports(**search_opts) [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] ret = obj(*args, **kwargs) [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self.list('ports', self.ports_path, retrieve_all, [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] ret = obj(*args, **kwargs) [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] for r in self._pagination(collection, path, **params): [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] res = self.get(path, params=params) [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] ret = obj(*args, **kwargs) [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self.retry_request("GET", action, body=body, [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] ret = obj(*args, **kwargs) [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] return self.do_request(method, action, body=body, [ 1193.161655] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1193.162648] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] ret = obj(*args, **kwargs) [ 1193.162648] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1193.162648] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] self._handle_fault_response(status_code, replybody, resp) [ 1193.162648] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1193.162648] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] raise exception.Unauthorized() [ 1193.162648] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] nova.exception.Unauthorized: Not authorized. 
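The chained dump above shows Nova's Neutron API wrapper turning a neutronclient 401 into nova.exception.Unauthorized while the original InstanceNotFound is still being handled, which is why several exceptions appear in one traceback. A minimal, self-contained sketch of that wrapper pattern (stand-in class and function names only, not Nova's actual code) that reproduces the same implicit exception chaining:

    # Illustrative sketch only: translate a client-level 401 into a service-level
    # exception inside a decorator, producing the "During handling of the above
    # exception, another exception occurred" chaining seen in the log above.
    import functools

    class ClientUnauthorized(Exception):
        """Stand-in for neutronclient.common.exceptions.Unauthorized."""

    class ServiceUnauthorized(Exception):
        """Stand-in for nova.exception.Unauthorized."""

    def translate_unauthorized(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except ClientUnauthorized:
                # Raising here keeps the client error as __context__, so both
                # tracebacks print, exactly as in the log dump above.
                raise ServiceUnauthorized("Not authorized.")
        return wrapper

    @translate_unauthorized
    def list_ports(device_id):
        # Pretend the Keystone token has expired, as it has in this log.
        raise ClientUnauthorized(
            "401: The request you have made requires authentication.")

    if __name__ == "__main__":
        list_ports("fcf47169-eb7a-4644-bf3f-7150c44c247f")

Running the sketch prints two chained tracebacks, mirroring the structure of the ERROR block above.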
[ 1193.162648] env[60024]: ERROR nova.compute.manager [instance: fcf47169-eb7a-4644-bf3f-7150c44c247f] [ 1193.162875] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1193.163163] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Creating directory with path [datastore2] vmware_temp/87b0440f-dc07-4820-abf5-5a7d255a67e2/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1193.163417] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f318ed73-15a3-430a-baa5-662097222e19 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.186143] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Created directory with path [datastore2] vmware_temp/87b0440f-dc07-4820-abf5-5a7d255a67e2/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1193.186320] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Fetch image to [datastore2] vmware_temp/87b0440f-dc07-4820-abf5-5a7d255a67e2/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1193.186556] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/87b0440f-dc07-4820-abf5-5a7d255a67e2/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1193.187694] env[60024]: DEBUG oslo_concurrency.lockutils [None req-b6b9848f-f046-48a5-90f1-4a5a47dde445 tempest-ServersTestFqdnHostnames-309226866 tempest-ServersTestFqdnHostnames-309226866-project-member] Lock "fcf47169-eb7a-4644-bf3f-7150c44c247f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 425.668s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1193.188456] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3c0ebe8-34b1-40f1-8bfe-2faaa0e335fb {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.197395] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c48710a0-5de7-4d8c-bdca-298ddba47d6e {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.210195] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43be8ee4-8671-4c90-a529-4191d9c499af {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.251854] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6974cf80-fe13-4e90-bd45-e214bec7a23f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.253556] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1193.253756] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1193.253925] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Deleting the datastore file [datastore2] d55ee9a1-6921-4648-ace2-f2da13c3523e {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1193.254193] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5e005046-97b1-48f3-98c6-746e1c560214 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.261664] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-500a4967-2999-4c4b-8d1d-cdf145b8b58f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.265506] env[60024]: DEBUG oslo_vmware.api [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Waiting for the task: (returnval){ [ 1193.265506] env[60024]: value = "task-4576338" [ 1193.265506] env[60024]: _type = "Task" [ 1193.265506] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1193.275753] env[60024]: DEBUG oslo_vmware.api [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Task: {'id': task-4576338, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1193.285626] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1193.392629] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1193.393492] env[60024]: ERROR nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Traceback (most recent call last): [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] result = getattr(controller, method)(*args, **kwargs) [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self._get(image_id) [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] resp, body = self.http_client.get(url, headers=header) [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self.request(url, 'GET', **kwargs) [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: 
e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self._handle_response(resp) [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] raise exc.from_response(resp, resp.content) [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] During handling of the above exception, another exception occurred: [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Traceback (most recent call last): [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] yield resources [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self.driver.spawn(context, instance, image_meta, [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1193.393492] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self._fetch_image_if_missing(context, vi) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] image_fetch(context, vi, tmp_image_ds_loc) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] images.fetch_image( [ 1193.394473] env[60024]: ERROR nova.compute.manager 
[instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] metadata = IMAGE_API.get(context, image_ref) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return session.show(context, image_id, [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] _reraise_translated_image_exception(image_id) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] raise new_exc.with_traceback(exc_trace) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] result = getattr(controller, method)(*args, **kwargs) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self._get(image_id) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] resp, body = self.http_client.get(url, headers=header) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self.request(url, 'GET', **kwargs) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: 
e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self._handle_response(resp) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] raise exc.from_response(resp, resp.content) [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1193.394473] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1193.394473] env[60024]: INFO nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Terminating instance [ 1193.395464] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1193.396086] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1193.396365] env[60024]: DEBUG nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1193.396563] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1193.396788] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a25c7574-75bb-4324-9287-b6ff723944e4 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.399794] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-846ac95f-e2f2-43c6-8ff4-cc4b50057d0e {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.407902] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1193.408210] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8b6a62d5-27db-4d6f-8196-4282182c325c {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.410679] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1193.410856] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1193.411864] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d3e7c7ad-125b-4f4c-868f-8c372b029bbf {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.417687] env[60024]: DEBUG oslo_vmware.api [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Waiting for the task: (returnval){ [ 1193.417687] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]52bf863d-c11f-26d3-e0d8-d37740438adb" [ 1193.417687] env[60024]: _type = "Task" [ 1193.417687] env[60024]: } to complete. 
{{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1193.434046] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1193.434333] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Creating directory with path [datastore2] vmware_temp/cdf3a18c-5ba1-4fb3-b840-47923f9a4141/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1193.434571] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4843c114-0d58-45f9-93d3-10d725323145 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.448156] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Created directory with path [datastore2] vmware_temp/cdf3a18c-5ba1-4fb3-b840-47923f9a4141/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1193.448383] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Fetch image to [datastore2] vmware_temp/cdf3a18c-5ba1-4fb3-b840-47923f9a4141/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1193.448552] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/cdf3a18c-5ba1-4fb3-b840-47923f9a4141/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1193.449387] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43371b1b-4b90-4aca-8c68-8dffe3626d30 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.457824] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdabf361-ddf5-49ed-9f8e-547a7669763d {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.468370] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f83f4ef6-d213-4f4a-b97a-d454e0cd89af {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.502922] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa6b9983-0999-47b6-96d0-150caac99cb8 {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.505657] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1193.505911] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1193.506134] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Deleting the datastore file [datastore2] e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1193.506379] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2be11fc0-b440-4f21-b7fc-747a8805085f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.512762] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0dee3a2f-e426-4783-b6a2-84f2b599f577 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.516042] env[60024]: DEBUG oslo_vmware.api [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Waiting for the task: (returnval){ [ 1193.516042] env[60024]: value = "task-4576340" [ 1193.516042] env[60024]: _type = "Task" [ 1193.516042] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1193.524806] env[60024]: DEBUG oslo_vmware.api [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Task: {'id': task-4576340, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1193.542946] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1193.653244] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Releasing lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1193.654237] env[60024]: ERROR nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Traceback (most recent call last): [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] result = getattr(controller, method)(*args, **kwargs) [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self._get(image_id) [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] resp, body = self.http_client.get(url, headers=header) [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self.request(url, 'GET', **kwargs) [ 1193.654237] env[60024]: ERROR nova.compute.manager 
[instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self._handle_response(resp) [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] raise exc.from_response(resp, resp.content) [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] During handling of the above exception, another exception occurred: [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Traceback (most recent call last): [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] yield resources [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self.driver.spawn(context, instance, image_meta, [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1193.654237] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self._fetch_image_if_missing(context, vi) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] image_fetch(context, vi, tmp_image_ds_loc) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] images.fetch_image( [ 1193.655229] env[60024]: ERROR 
nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] metadata = IMAGE_API.get(context, image_ref) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return session.show(context, image_id, [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] _reraise_translated_image_exception(image_id) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] raise new_exc.with_traceback(exc_trace) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] result = getattr(controller, method)(*args, **kwargs) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self._get(image_id) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] resp, body = self.http_client.get(url, headers=header) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self.request(url, 'GET', **kwargs) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 
54ded864-1c3e-4a47-968f-ca597c82cb87] return self._handle_response(resp) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] raise exc.from_response(resp, resp.content) [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. [ 1193.655229] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1193.655229] env[60024]: INFO nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Terminating instance [ 1193.656619] env[60024]: DEBUG oslo_concurrency.lockutils [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Acquired lock "[datastore2] devstack-image-cache_base/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc.vmdk" {{(pid=60024) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1193.656831] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1193.657506] env[60024]: DEBUG nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Start destroying the instance on the hypervisor. 
{{(pid=60024) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1193.660048] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Destroying instance {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1193.660048] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3105a4cb-ceda-468b-ad72-fc212852a075 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.661014] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7783a27c-2188-4bbe-9f9a-bb11fcc5bae5 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.670188] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Unregistering the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1193.670523] env[60024]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ff87298d-304e-4d08-9257-fe0b7cb898e7 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.673594] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1193.673923] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=60024) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1193.675127] env[60024]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-05890238-4255-4a5e-a861-5a6f9024a408 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.681926] env[60024]: DEBUG oslo_vmware.api [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Waiting for the task: (returnval){ [ 1193.681926] env[60024]: value = "session[52034186-32b2-4163-5772-6df9eda9abbc]5279bb0a-d0fd-4037-f865-57e5757d6efb" [ 1193.681926] env[60024]: _type = "Task" [ 1193.681926] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1193.692379] env[60024]: DEBUG oslo_vmware.api [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Task: {'id': session[52034186-32b2-4163-5772-6df9eda9abbc]5279bb0a-d0fd-4037-f865-57e5757d6efb, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1193.748646] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Unregistered the VM {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1193.749077] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Deleting contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1193.749077] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Deleting the datastore file [datastore2] 54ded864-1c3e-4a47-968f-ca597c82cb87 {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1193.749369] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6a4326cd-bcb6-43e9-bbae-00a7b49dae14 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1193.758230] env[60024]: DEBUG oslo_vmware.api [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Waiting for the task: (returnval){ [ 1193.758230] env[60024]: value = "task-4576342" [ 1193.758230] env[60024]: _type = "Task" [ 1193.758230] env[60024]: } to complete. {{(pid=60024) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1193.766685] env[60024]: DEBUG oslo_vmware.api [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Task: {'id': task-4576342, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1193.775409] env[60024]: DEBUG oslo_vmware.api [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Task: {'id': task-4576338, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077656} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1193.775682] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1193.775868] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1193.776060] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1193.776246] env[60024]: INFO nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Took 0.66 seconds to destroy the instance on the hypervisor. [ 1193.778805] env[60024]: DEBUG nova.compute.claims [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1193.779017] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1193.779292] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1193.810366] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.031s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1193.811552] env[60024]: DEBUG nova.compute.utils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Instance d55ee9a1-6921-4648-ace2-f2da13c3523e could not be found. 
{{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1193.813337] env[60024]: DEBUG nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Instance disappeared during build. {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1193.813574] env[60024]: DEBUG nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1193.813782] env[60024]: DEBUG nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1193.813976] env[60024]: DEBUG nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1193.814223] env[60024]: DEBUG nova.network.neutron [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1193.912466] env[60024]: DEBUG neutronclient.v2_0.client [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60024) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1193.914158] env[60024]: ERROR nova.compute.manager [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
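The record above fails because deallocate_for_instance() first lists the instance's ports in Neutron, and that call is rejected once the request's Keystone token is no longer valid. A rough standalone equivalent of that port lookup, assuming placeholder endpoint and credentials (not this deployment's real values):

    # Illustrative sketch only: the port lookup performed during network
    # deallocation, reproduced with neutronclient. With an expired or invalid
    # token this call returns the 401 Unauthorized seen in the log.
    from keystoneauth1 import session
    from keystoneauth1.identity import v3
    from neutronclient.v2_0 import client as neutron_client

    auth = v3.Password(
        auth_url="http://controller:5000/v3",   # placeholder Keystone endpoint
        username="nova", password="secret",      # placeholder credentials
        project_name="service",
        user_domain_id="default", project_domain_id="default")
    sess = session.Session(auth=auth)
    neutron = neutron_client.Client(session=sess)

    # Nova filters ports by the instance UUID (the port's device_id) before
    # unbinding and deleting them during deallocation.
    instance_uuid = "d55ee9a1-6921-4648-ace2-f2da13c3523e"
    ports = neutron.list_ports(device_id=instance_uuid).get("ports", [])
    print("ports owned by instance:", [p["id"] for p in ports])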
[ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Traceback (most recent call last): [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] result = getattr(controller, method)(*args, **kwargs) [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self._get(image_id) [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] resp, body = self.http_client.get(url, headers=header) [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self.request(url, 'GET', **kwargs) [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self._handle_response(resp) [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] raise exc.from_response(resp, resp.content) [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] During handling of the above exception, another exception occurred: [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Traceback (most recent call last): [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self.driver.spawn(context, instance, image_meta, [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self._fetch_image_if_missing(context, vi) [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1193.914158] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] image_fetch(context, vi, tmp_image_ds_loc) [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] images.fetch_image( [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] metadata = IMAGE_API.get(context, image_ref) [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return session.show(context, image_id, [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] _reraise_translated_image_exception(image_id) [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] raise new_exc.with_traceback(exc_trace) [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: 
d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] result = getattr(controller, method)(*args, **kwargs) [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self._get(image_id) [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] resp, body = self.http_client.get(url, headers=header) [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self.request(url, 'GET', **kwargs) [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self._handle_response(resp) [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] raise exc.from_response(resp, resp.content) [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. 
[ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] During handling of the above exception, another exception occurred: [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Traceback (most recent call last): [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self._build_and_run_instance(context, instance, image, [ 1193.915223] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] with excutils.save_and_reraise_exception(): [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self.force_reraise() [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] raise self.value [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] with self.rt.instance_claim(context, instance, node, allocs, [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self.abort() [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return f(*args, **kwargs) [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self._unset_instance_host_and_node(instance) [ 1193.916240] env[60024]: ERROR nova.compute.manager 
[instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] instance.save() [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] updates, result = self.indirection_api.object_action( [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return cctxt.call(context, 'object_action', objinst=objinst, [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] result = self.transport._send( [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self._driver.send(target, ctxt, message, [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] raise result [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] nova.exception_Remote.InstanceNotFound_Remote: Instance d55ee9a1-6921-4648-ace2-f2da13c3523e could not be found. 
[ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Traceback (most recent call last): [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1193.916240] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return getattr(target, method)(*args, **kwargs) [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return fn(self, *args, **kwargs) [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] old_ref, inst_ref = db.instance_update_and_get_original( [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return f(*args, **kwargs) [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] with excutils.save_and_reraise_exception() as ectxt: [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self.force_reraise() [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] raise self.value [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return f(*args, 
**kwargs) [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return f(context, *args, **kwargs) [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] raise exception.InstanceNotFound(instance_id=uuid) [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] nova.exception.InstanceNotFound: Instance d55ee9a1-6921-4648-ace2-f2da13c3523e could not be found. [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] During handling of the above exception, another exception occurred: [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Traceback (most recent call last): [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] ret = obj(*args, **kwargs) [ 1193.917269] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] exception_handler_v20(status_code, error_body) [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] raise client_exc(message=error_message, [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1193.918550] 
env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Neutron server returns request_ids: ['req-f80fa5ac-0b95-4a01-94df-7b8491e8a98f'] [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] During handling of the above exception, another exception occurred: [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] Traceback (most recent call last): [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self._deallocate_network(context, instance, requested_networks) [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self.network_api.deallocate_for_instance( [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] data = neutron.list_ports(**search_opts) [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] ret = obj(*args, **kwargs) [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self.list('ports', self.ports_path, retrieve_all, [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] ret = obj(*args, **kwargs) [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] for r in self._pagination(collection, path, **params): [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] res = self.get(path, params=params) [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] ret = obj(*args, **kwargs) [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self.retry_request("GET", action, body=body, [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] ret = obj(*args, **kwargs) [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] return self.do_request(method, action, body=body, [ 1193.918550] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1193.919554] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] ret = obj(*args, **kwargs) [ 1193.919554] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1193.919554] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] self._handle_fault_response(status_code, replybody, resp) [ 1193.919554] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1193.919554] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] raise exception.Unauthorized() [ 1193.919554] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] nova.exception.Unauthorized: Not authorized. [ 1193.919554] env[60024]: ERROR nova.compute.manager [instance: d55ee9a1-6921-4648-ace2-f2da13c3523e] [ 1193.939590] env[60024]: DEBUG oslo_concurrency.lockutils [None req-645fc93b-ea81-455c-ada2-ce9f7feb6a40 tempest-DeleteServersAdminTestJSON-4043494 tempest-DeleteServersAdminTestJSON-4043494-project-member] Lock "d55ee9a1-6921-4648-ace2-f2da13c3523e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 389.330s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1194.027598] env[60024]: DEBUG oslo_vmware.api [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Task: {'id': task-4576340, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075511} completed successfully. 
{{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1194.027813] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1194.027999] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1194.028188] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1194.028364] env[60024]: INFO nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Took 0.63 seconds to destroy the instance on the hypervisor. [ 1194.030579] env[60024]: DEBUG nova.compute.claims [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1194.030751] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1194.030959] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1194.057843] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1194.058623] env[60024]: DEBUG nova.compute.utils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Instance e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb could not be found. 
{{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1194.060101] env[60024]: DEBUG nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Instance disappeared during build. {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1194.060271] env[60024]: DEBUG nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1194.060434] env[60024]: DEBUG nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1194.060601] env[60024]: DEBUG nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1194.060763] env[60024]: DEBUG nova.network.neutron [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1194.159622] env[60024]: DEBUG neutronclient.v2_0.client [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60024) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1194.161181] env[60024]: ERROR nova.compute.manager [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Traceback (most recent call last): [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] result = getattr(controller, method)(*args, **kwargs) [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self._get(image_id) [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] resp, body = self.http_client.get(url, headers=header) [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self.request(url, 'GET', **kwargs) [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self._handle_response(resp) [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] raise exc.from_response(resp, resp.content) [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] During handling of the above exception, another exception occurred: [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Traceback (most recent call last): [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self.driver.spawn(context, instance, image_meta, [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self._fetch_image_if_missing(context, vi) [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1194.161181] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] image_fetch(context, vi, tmp_image_ds_loc) [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] images.fetch_image( [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] metadata = IMAGE_API.get(context, image_ref) [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return session.show(context, image_id, [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] _reraise_translated_image_exception(image_id) [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] raise new_exc.with_traceback(exc_trace) [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: 
e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] result = getattr(controller, method)(*args, **kwargs) [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self._get(image_id) [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] resp, body = self.http_client.get(url, headers=header) [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self.request(url, 'GET', **kwargs) [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self._handle_response(resp) [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] raise exc.from_response(resp, resp.content) [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. 
[ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] During handling of the above exception, another exception occurred: [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Traceback (most recent call last): [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self._build_and_run_instance(context, instance, image, [ 1194.162253] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] with excutils.save_and_reraise_exception(): [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self.force_reraise() [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] raise self.value [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] with self.rt.instance_claim(context, instance, node, allocs, [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self.abort() [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return f(*args, **kwargs) [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self._unset_instance_host_and_node(instance) [ 1194.163252] env[60024]: ERROR nova.compute.manager 
[instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] instance.save() [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] updates, result = self.indirection_api.object_action( [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return cctxt.call(context, 'object_action', objinst=objinst, [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] result = self.transport._send( [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self._driver.send(target, ctxt, message, [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] raise result [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] nova.exception_Remote.InstanceNotFound_Remote: Instance e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb could not be found. 
[ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Traceback (most recent call last): [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1194.163252] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return getattr(target, method)(*args, **kwargs) [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return fn(self, *args, **kwargs) [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] old_ref, inst_ref = db.instance_update_and_get_original( [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return f(*args, **kwargs) [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] with excutils.save_and_reraise_exception() as ectxt: [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self.force_reraise() [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] raise self.value [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return f(*args, 
**kwargs) [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return f(context, *args, **kwargs) [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] raise exception.InstanceNotFound(instance_id=uuid) [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] nova.exception.InstanceNotFound: Instance e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb could not be found. [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] During handling of the above exception, another exception occurred: [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Traceback (most recent call last): [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] ret = obj(*args, **kwargs) [ 1194.164277] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] exception_handler_v20(status_code, error_body) [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] raise client_exc(message=error_message, [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1194.165489] 
env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Neutron server returns request_ids: ['req-6d161646-201f-43b7-af41-ad76d93e5c31'] [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] During handling of the above exception, another exception occurred: [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] Traceback (most recent call last): [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self._deallocate_network(context, instance, requested_networks) [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self.network_api.deallocate_for_instance( [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] data = neutron.list_ports(**search_opts) [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] ret = obj(*args, **kwargs) [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self.list('ports', self.ports_path, retrieve_all, [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] ret = obj(*args, **kwargs) [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] for r in self._pagination(collection, path, **params): [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] res = self.get(path, params=params) [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] ret = obj(*args, **kwargs) [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self.retry_request("GET", action, body=body, [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] ret = obj(*args, **kwargs) [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] return self.do_request(method, action, body=body, [ 1194.165489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1194.166489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] ret = obj(*args, **kwargs) [ 1194.166489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1194.166489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] self._handle_fault_response(status_code, replybody, resp) [ 1194.166489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1194.166489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] raise exception.Unauthorized() [ 1194.166489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] nova.exception.Unauthorized: Not authorized. 
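The traceback above bottoms out in nova/network/neutron.py (lines 196 and 204 in this deployment), where a wrapper around every neutronclient call converts the Neutron server's HTTP 401 into nova.exception.Unauthorized, the "Not authorized." seen at the end of the chain. A minimal sketch of that translation pattern, assuming an already-constructed neutron client; wrap_neutron_call and NotAuthorized are illustrative names, not the actual Nova symbols:

import functools

from neutronclient.common import exceptions as neutron_exc


class NotAuthorized(Exception):
    """Stand-in for nova.exception.Unauthorized ("Not authorized.")."""


def wrap_neutron_call(func):
    # Decorate a neutronclient method so that an HTTP 401 from the Neutron
    # server surfaces as the service's own NotAuthorized, as in the log above.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except neutron_exc.Unauthorized:
            raise NotAuthorized()
    return wrapper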
[ 1194.166489] env[60024]: ERROR nova.compute.manager [instance: e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb] [ 1194.182661] env[60024]: DEBUG oslo_concurrency.lockutils [None req-5bf779b6-749c-4443-bf58-24963e6d36c6 tempest-ServerPasswordTestJSON-1793502215 tempest-ServerPasswordTestJSON-1793502215-project-member] Lock "e8ed8e30-1b1d-4ab4-abd7-e68cb72916cb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 308.497s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1194.192812] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Preparing fetch location {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1194.193168] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Creating directory with path [datastore2] vmware_temp/31836b23-bc29-4f2e-8591-688bb961243f/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1194.193413] env[60024]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-98634a9e-d493-41b7-82fb-adfc6c69c7c9 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1194.205707] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Created directory with path [datastore2] vmware_temp/31836b23-bc29-4f2e-8591-688bb961243f/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc {{(pid=60024) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1194.205902] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Fetch image to [datastore2] vmware_temp/31836b23-bc29-4f2e-8591-688bb961243f/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk {{(pid=60024) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1194.206089] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to [datastore2] vmware_temp/31836b23-bc29-4f2e-8591-688bb961243f/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk on the data store datastore2 {{(pid=60024) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1194.207108] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bcac915-7009-4028-9df1-71c51cd7656f {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1194.214245] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bba06432-970a-4fe4-a8b0-0573c1939f4a {{(pid=60024) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1194.224136] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2fc7620-57a3-4e3b-aa59-1716c256a3b0 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1194.257487] env[60024]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8c62866-5718-4162-9d5f-2a26725fbbce {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1194.271403] env[60024]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-045c865e-02c7-4b22-8838-1d3098ab3851 {{(pid=60024) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1194.273281] env[60024]: DEBUG oslo_vmware.api [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Task: {'id': task-4576342, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084885} completed successfully. {{(pid=60024) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1194.273516] env[60024]: DEBUG nova.virt.vmwareapi.ds_util [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Deleted the datastore file {{(pid=60024) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1194.273702] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Deleted contents of the VM from datastore datastore2 {{(pid=60024) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1194.273875] env[60024]: DEBUG nova.virt.vmwareapi.vmops [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Instance destroyed {{(pid=60024) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1194.274072] env[60024]: INFO nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Took 0.62 seconds to destroy the instance on the hypervisor. 
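The records above stage the image through a per-run temp directory on the datastore: a directory [datastore2] vmware_temp/<run-uuid>/<image-id>/ is created and the image bits are fetched into tmp-sparse.vmdk inside it before any VM consumes them. A rough sketch of how such a staging path is composed, assuming the same layout as the log; build_staging_path is an illustrative helper, not a Nova function:

import uuid


def build_staging_path(datastore, image_id, run_id=None):
    # e.g. "[datastore2] vmware_temp/<run-id>/<image-id>/tmp-sparse.vmdk"
    run_id = run_id or str(uuid.uuid4())
    return "[%s] vmware_temp/%s/%s/tmp-sparse.vmdk" % (datastore, run_id, image_id)


print(build_staging_path("datastore2", "ce78d8ba-df84-4ce9-9b5e-632fda86b4cc"))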
[ 1194.276284] env[60024]: DEBUG nova.compute.claims [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Aborting claim: {{(pid=60024) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1194.276480] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1194.276693] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1194.298020] env[60024]: DEBUG nova.virt.vmwareapi.images [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Downloading image file data ce78d8ba-df84-4ce9-9b5e-632fda86b4cc to the data store datastore2 {{(pid=60024) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1194.304805] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1194.305610] env[60024]: DEBUG nova.compute.utils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Instance 54ded864-1c3e-4a47-968f-ca597c82cb87 could not be found. {{(pid=60024) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1194.307113] env[60024]: DEBUG nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Instance disappeared during build. {{(pid=60024) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1194.307282] env[60024]: DEBUG nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Unplugging VIFs for instance {{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1194.307628] env[60024]: DEBUG nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=60024) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1194.307628] env[60024]: DEBUG nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Deallocating network for instance {{(pid=60024) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1194.307785] env[60024]: DEBUG nova.network.neutron [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] deallocate_for_instance() {{(pid=60024) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1194.402207] env[60024]: DEBUG oslo_vmware.rw_handles [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/31836b23-bc29-4f2e-8591-688bb961243f/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=60024) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1194.466499] env[60024]: DEBUG neutronclient.v2_0.client [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=60024) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1194.468054] env[60024]: ERROR nova.compute.manager [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
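The deallocation that fails here follows the path shown in the earlier traceback: _deallocate_network calls deallocate_for_instance(), which first lists the instance's ports in Neutron (filtered by device_id) before removing them, so a 401 on list_ports aborts the whole cleanup and leaves the ports behind. A hedged sketch of that flow using the standard neutronclient calls (list_ports, delete_port); deallocate_ports is an illustrative name and the client is assumed to be already authenticated:

def deallocate_ports(neutron, instance_uuid):
    """neutron: an authenticated neutronclient.v2_0.client.Client instance."""
    # Find the ports bound to this instance, then delete them one by one;
    # a 401 from list_ports stops the loop before any port is removed.
    ports = neutron.list_ports(device_id=instance_uuid).get('ports', [])
    for port in ports:
        neutron.delete_port(port['id'])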
[ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Traceback (most recent call last): [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] result = getattr(controller, method)(*args, **kwargs) [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self._get(image_id) [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] resp, body = self.http_client.get(url, headers=header) [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self.request(url, 'GET', **kwargs) [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self._handle_response(resp) [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] raise exc.from_response(resp, resp.content) [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
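The Glance leg of this failure starts one layer down: before streaming the image to the datastore, the driver asks Glance for the image record via the v2 client, and the rejected token produces the HTTP 401 above. A minimal sketch of that lookup, assuming a keystoneauth1 session is available; get_image_meta is an illustrative wrapper around the standard glanceclient v2 surface:

import glanceclient
from glanceclient import exc as glance_exc


def get_image_meta(session, image_id):
    # 'session' is assumed to be a keystoneauth1 session carrying a valid token.
    glance = glanceclient.Client('2', session=session)
    try:
        return glance.images.get(image_id)
    except glance_exc.HTTPUnauthorized:
        # Token expired or invalid (HTTP 401) -> the failure logged above.
        raise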
[ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] During handling of the above exception, another exception occurred: [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Traceback (most recent call last): [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self.driver.spawn(context, instance, image_meta, [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self._fetch_image_if_missing(context, vi) [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1194.468054] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] image_fetch(context, vi, tmp_image_ds_loc) [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] images.fetch_image( [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] metadata = IMAGE_API.get(context, image_ref) [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return session.show(context, image_id, [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] _reraise_translated_image_exception(image_id) [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] raise new_exc.with_traceback(exc_trace) [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 
54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] result = getattr(controller, method)(*args, **kwargs) [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self._get(image_id) [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] resp, body = self.http_client.get(url, headers=header) [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self.request(url, 'GET', **kwargs) [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self._handle_response(resp) [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] raise exc.from_response(resp, resp.content) [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] nova.exception.ImageNotAuthorized: Not authorized for image ce78d8ba-df84-4ce9-9b5e-632fda86b4cc. 
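The last frames show how the raw glanceclient error becomes the service-level message: nova/image/glance.py re-raises it as ImageNotAuthorized while keeping the original traceback via with_traceback(). A sketch of that translation step, assuming it is invoked from inside the except block as the traceback indicates; the ImageNotAuthorized class below is a stand-in, not the Nova exception:

import sys

from glanceclient import exc as glance_exc


class ImageNotAuthorized(Exception):
    """Stand-in for nova.exception.ImageNotAuthorized."""
    def __init__(self, image_id):
        super().__init__("Not authorized for image %s." % image_id)


def reraise_translated_image_exception(image_id):
    # Must be called from inside an 'except' block so sys.exc_info() is set.
    exc_type, exc_value, exc_trace = sys.exc_info()
    if isinstance(exc_value, (glance_exc.HTTPForbidden,
                              glance_exc.HTTPUnauthorized)):
        # Swap in the service-level exception but keep the original traceback.
        raise ImageNotAuthorized(image_id).with_traceback(exc_trace)
    raise exc_value.with_traceback(exc_trace)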
[ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] During handling of the above exception, another exception occurred: [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Traceback (most recent call last): [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self._build_and_run_instance(context, instance, image, [ 1194.469050] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] with excutils.save_and_reraise_exception(): [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self.force_reraise() [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] raise self.value [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] with self.rt.instance_claim(context, instance, node, allocs, [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self.abort() [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return f(*args, **kwargs) [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self._unset_instance_host_and_node(instance) [ 1194.470016] env[60024]: ERROR nova.compute.manager 
[instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] instance.save() [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] updates, result = self.indirection_api.object_action( [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return cctxt.call(context, 'object_action', objinst=objinst, [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] result = self.transport._send( [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self._driver.send(target, ctxt, message, [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] raise result [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] nova.exception_Remote.InstanceNotFound_Remote: Instance 54ded864-1c3e-4a47-968f-ca597c82cb87 could not be found. 
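The nested failure here is a race: the build lock was held for roughly five minutes, the tempest test deleted the server in the meantime, and the claim-abort path (_unset_instance_host_and_node followed by instance.save()) then hits InstanceNotFound, raised in the conductor and re-raised locally over RPC. A defensive variant of that step would treat an already-deleted instance as success; the sketch below assumes an object with host/node attributes and a save() method and is illustrative only:

class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""


def unset_instance_host_and_node(instance):
    # Clear the scheduling fields and persist them, tolerating the case where
    # the instance row was deleted while the build lock was still held.
    instance.host = None
    instance.node = None
    try:
        instance.save()
    except InstanceNotFound:
        # Nothing left to update; the claim abort can proceed.
        pass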
[ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Traceback (most recent call last): [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1194.470016] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return getattr(target, method)(*args, **kwargs) [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return fn(self, *args, **kwargs) [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] old_ref, inst_ref = db.instance_update_and_get_original( [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return f(*args, **kwargs) [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] with excutils.save_and_reraise_exception() as ectxt: [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self.force_reraise() [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] raise self.value [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return f(*args, 
**kwargs) [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return f(context, *args, **kwargs) [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] raise exception.InstanceNotFound(instance_id=uuid) [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] nova.exception.InstanceNotFound: Instance 54ded864-1c3e-4a47-968f-ca597c82cb87 could not be found. [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] During handling of the above exception, another exception occurred: [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Traceback (most recent call last): [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] ret = obj(*args, **kwargs) [ 1194.471025] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] exception_handler_v20(status_code, error_body) [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] raise client_exc(message=error_message, [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1194.472437] 
env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Neutron server returns request_ids: ['req-0edab96d-dd67-45aa-8414-08b2c9522331'] [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] During handling of the above exception, another exception occurred: [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] Traceback (most recent call last): [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self._deallocate_network(context, instance, requested_networks) [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self.network_api.deallocate_for_instance( [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] data = neutron.list_ports(**search_opts) [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] ret = obj(*args, **kwargs) [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self.list('ports', self.ports_path, retrieve_all, [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] ret = obj(*args, **kwargs) [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] for r in self._pagination(collection, path, **params): [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] res = self.get(path, params=params) [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] ret = obj(*args, **kwargs) [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self.retry_request("GET", action, body=body, [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] ret = obj(*args, **kwargs) [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] return self.do_request(method, action, body=body, [ 1194.472437] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1194.473430] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] ret = obj(*args, **kwargs) [ 1194.473430] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1194.473430] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] self._handle_fault_response(status_code, replybody, resp) [ 1194.473430] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1194.473430] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] raise exception.Unauthorized() [ 1194.473430] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] nova.exception.Unauthorized: Not authorized. [ 1194.473430] env[60024]: ERROR nova.compute.manager [instance: 54ded864-1c3e-4a47-968f-ca597c82cb87] [ 1194.473430] env[60024]: DEBUG oslo_vmware.rw_handles [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Completed reading data from the image iterator. {{(pid=60024) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1194.473430] env[60024]: DEBUG oslo_vmware.rw_handles [None req-42128840-7836-452c-b26a-f75d3184a2f3 tempest-AttachInterfacesV270Test-252724916 tempest-AttachInterfacesV270Test-252724916-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/31836b23-bc29-4f2e-8591-688bb961243f/ce78d8ba-df84-4ce9-9b5e-632fda86b4cc/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=60024) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1194.492532] env[60024]: DEBUG oslo_concurrency.lockutils [None req-683d5b43-8eb8-4924-b841-8951f706b180 tempest-AttachVolumeNegativeTest-692012757 tempest-AttachVolumeNegativeTest-692012757-project-member] Lock "54ded864-1c3e-4a47-968f-ca597c82cb87" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 306.803s {{(pid=60024) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1204.757405] env[60024]: DEBUG nova.compute.manager [req-e90fe7c3-14eb-4c99-a149-64ab9024475d req-651f24d3-a424-4689-8ce7-3b625ed6b42e service nova] [instance: 8b64034a-4d67-4605-adb8-a007dd735230] Received event network-vif-deleted-cab434dd-5abd-453d-b107-7c888181bdbe {{(pid=60024) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
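The lock records bracketing these failures report waited/held times because the whole build runs under a per-instance lock: only one _locked_do_build_and_run_instance executes for a given UUID, and here it was held for 306-308 seconds before release. A small sketch of that serialisation using oslo.concurrency's lock() context manager; build_instance_locked and do_build are illustrative names:

import time

from oslo_concurrency import lockutils


def build_instance_locked(instance_uuid, do_build):
    # Serialise the build on the instance UUID, mirroring the acquire/release
    # pairs logged above, and return how long the lock was held.
    start = time.monotonic()
    with lockutils.lock(instance_uuid):
        do_build()
    return time.monotonic() - start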